00:00:00.001 Started by upstream project "autotest-per-patch" build number 124192 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.032 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.033 The recommended git tool is: git 00:00:00.033 using credential 00000000-0000-0000-0000-000000000002 00:00:00.034 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.048 Fetching changes from the remote Git repository 00:00:00.050 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.083 Using shallow fetch with depth 1 00:00:00.083 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.083 > git --version # timeout=10 00:00:00.109 > git --version # 'git version 2.39.2' 00:00:00.109 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.131 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.131 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:03.367 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:03.379 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:03.390 Checking out Revision 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 (FETCH_HEAD) 00:00:03.390 > git config core.sparsecheckout # timeout=10 00:00:03.401 > git read-tree -mu HEAD # timeout=10 00:00:03.417 > git checkout -f 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 # timeout=5 00:00:03.437 Commit message: "pool: fixes for VisualBuild class" 00:00:03.437 > git rev-list --no-walk 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 # timeout=10 00:00:03.539 [Pipeline] Start of Pipeline 00:00:03.553 [Pipeline] library 00:00:03.555 Loading library shm_lib@master 00:00:07.499 Library shm_lib@master is cached. Copying from home. 00:00:07.529 [Pipeline] node 00:00:07.625 Running on CYP6 in /var/jenkins/workspace/crypto-phy-autotest 00:00:07.627 [Pipeline] { 00:00:07.649 [Pipeline] catchError 00:00:07.652 [Pipeline] { 00:00:07.673 [Pipeline] wrap 00:00:07.689 [Pipeline] { 00:00:07.700 [Pipeline] stage 00:00:07.703 [Pipeline] { (Prologue) 00:00:07.881 [Pipeline] sh 00:00:08.163 + logger -p user.info -t JENKINS-CI 00:00:08.178 [Pipeline] echo 00:00:08.179 Node: CYP6 00:00:08.185 [Pipeline] sh 00:00:08.483 [Pipeline] setCustomBuildProperty 00:00:08.494 [Pipeline] echo 00:00:08.496 Cleanup processes 00:00:08.501 [Pipeline] sh 00:00:08.782 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:08.782 779066 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:08.797 [Pipeline] sh 00:00:09.085 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:09.085 ++ grep -v 'sudo pgrep' 00:00:09.085 ++ awk '{print $1}' 00:00:09.085 + sudo kill -9 00:00:09.085 + true 00:00:09.100 [Pipeline] cleanWs 00:00:09.109 [WS-CLEANUP] Deleting project workspace... 00:00:09.110 [WS-CLEANUP] Deferred wipeout is used... 
00:00:09.117 [WS-CLEANUP] done
00:00:09.121 [Pipeline] setCustomBuildProperty
00:00:09.134 [Pipeline] sh
00:00:09.415 + sudo git config --global --replace-all safe.directory '*'
00:00:09.493 [Pipeline] nodesByLabel
00:00:09.495 Found a total of 2 nodes with the 'sorcerer' label
00:00:09.506 [Pipeline] httpRequest
00:00:09.511 HttpMethod: GET
00:00:09.512 URL: http://10.211.164.101/packages/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:00:09.515 Sending request to url: http://10.211.164.101/packages/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:00:09.534 Response Code: HTTP/1.1 200 OK
00:00:09.535 Success: Status code 200 is in the accepted range: 200,404
00:00:09.536 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:00:17.317 [Pipeline] sh
00:00:17.603 + tar --no-same-owner -xf jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:00:17.621 [Pipeline] httpRequest
00:00:17.625 HttpMethod: GET
00:00:17.626 URL: http://10.211.164.101/packages/spdk_3a44739b7d3100784f7efecc8e3eb1995fd1f244.tar.gz
00:00:17.627 Sending request to url: http://10.211.164.101/packages/spdk_3a44739b7d3100784f7efecc8e3eb1995fd1f244.tar.gz
00:00:17.652 Response Code: HTTP/1.1 200 OK
00:00:17.653 Success: Status code 200 is in the accepted range: 200,404
00:00:17.653 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_3a44739b7d3100784f7efecc8e3eb1995fd1f244.tar.gz
00:01:05.899 [Pipeline] sh
00:01:06.219 + tar --no-same-owner -xf spdk_3a44739b7d3100784f7efecc8e3eb1995fd1f244.tar.gz
00:01:09.531 [Pipeline] sh
00:01:09.815 + git -C spdk log --oneline -n5
00:01:09.815 3a44739b7 nvmf/tcp: move await_req handling to nvmf_tcp_req_put()
00:01:09.815 be02286f6 nvmf: move register nvmf_poll_group_poll interrupt to nvmf
00:01:09.815 9b5203592 nvmf/tcp: replace pending_buf_queue with iobuf callbacks
00:01:09.815 d216ec301 nvmf: extend API to request buffer with iobuf callback
00:01:09.815 9a8d8bdaa nvmf/tcp: use sock group polling for the listening sockets
00:01:09.828 [Pipeline] }
00:01:09.845 [Pipeline] // stage
00:01:09.855 [Pipeline] stage
00:01:09.857 [Pipeline] { (Prepare)
00:01:09.877 [Pipeline] writeFile
00:01:09.895 [Pipeline] sh
00:01:10.181 + logger -p user.info -t JENKINS-CI
00:01:10.193 [Pipeline] sh
00:01:10.481 + logger -p user.info -t JENKINS-CI
00:01:10.493 [Pipeline] sh
00:01:10.772 + cat autorun-spdk.conf
00:01:10.772 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:10.772 SPDK_TEST_BLOCKDEV=1
00:01:10.772 SPDK_TEST_ISAL=1
00:01:10.772 SPDK_TEST_CRYPTO=1
00:01:10.772 SPDK_TEST_REDUCE=1
00:01:10.772 SPDK_TEST_VBDEV_COMPRESS=1
00:01:10.772 SPDK_RUN_UBSAN=1
00:01:10.780 RUN_NIGHTLY=0
00:01:10.786 [Pipeline] readFile
00:01:10.813 [Pipeline] withEnv
00:01:10.816 [Pipeline] {
00:01:10.830 [Pipeline] sh
00:01:11.116 + set -ex
00:01:11.116 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:01:11.116 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:11.116 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:11.116 ++ SPDK_TEST_BLOCKDEV=1
00:01:11.116 ++ SPDK_TEST_ISAL=1
00:01:11.116 ++ SPDK_TEST_CRYPTO=1
00:01:11.116 ++ SPDK_TEST_REDUCE=1
00:01:11.116 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:11.116 ++ SPDK_RUN_UBSAN=1
00:01:11.116 ++ RUN_NIGHTLY=0
00:01:11.116 + case $SPDK_TEST_NVMF_NICS in
00:01:11.116 + DRIVERS=
00:01:11.116 + [[ -n '' ]]
00:01:11.116 + exit 0
00:01:11.126 [Pipeline] }
00:01:11.146 [Pipeline] // withEnv
00:01:11.152 [Pipeline] }
00:01:11.170 [Pipeline] // stage
00:01:11.181
[Pipeline] catchError 00:01:11.183 [Pipeline] { 00:01:11.200 [Pipeline] timeout 00:01:11.200 Timeout set to expire in 40 min 00:01:11.202 [Pipeline] { 00:01:11.220 [Pipeline] stage 00:01:11.222 [Pipeline] { (Tests) 00:01:11.238 [Pipeline] sh 00:01:11.525 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:01:11.525 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:01:11.525 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:01:11.525 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:01:11.525 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:11.525 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:01:11.525 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:01:11.525 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:11.525 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:01:11.525 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:11.525 + [[ crypto-phy-autotest == pkgdep-* ]] 00:01:11.525 + cd /var/jenkins/workspace/crypto-phy-autotest 00:01:11.525 + source /etc/os-release 00:01:11.525 ++ NAME='Fedora Linux' 00:01:11.525 ++ VERSION='38 (Cloud Edition)' 00:01:11.525 ++ ID=fedora 00:01:11.525 ++ VERSION_ID=38 00:01:11.525 ++ VERSION_CODENAME= 00:01:11.525 ++ PLATFORM_ID=platform:f38 00:01:11.525 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:11.525 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:11.525 ++ LOGO=fedora-logo-icon 00:01:11.525 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:11.525 ++ HOME_URL=https://fedoraproject.org/ 00:01:11.525 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:11.525 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:11.525 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:11.525 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:11.525 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:11.525 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:11.525 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:11.525 ++ SUPPORT_END=2024-05-14 00:01:11.525 ++ VARIANT='Cloud Edition' 00:01:11.525 ++ VARIANT_ID=cloud 00:01:11.525 + uname -a 00:01:11.525 Linux spdk-CYP-06 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:01:11.525 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:01:14.824 Hugepages 00:01:14.824 node hugesize free / total 00:01:14.824 node0 1048576kB 0 / 0 00:01:14.824 node0 2048kB 0 / 0 00:01:14.824 node1 1048576kB 0 / 0 00:01:14.824 node1 2048kB 0 / 0 00:01:14.824 00:01:14.824 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:14.824 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - - 00:01:14.824 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - - 00:01:14.824 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - - 00:01:14.824 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - - 00:01:14.824 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - - 00:01:14.824 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - - 00:01:14.824 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - - 00:01:14.824 I/OAT 0000:00:01.7 8086 0b00 0 ioatdma - - 00:01:15.085 NVMe 0000:65:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:01:15.085 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - - 00:01:15.085 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - - 00:01:15.085 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - - 00:01:15.085 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - - 00:01:15.085 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - - 00:01:15.085 I/OAT 0000:80:01.5 8086 0b00 1 ioatdma - - 00:01:15.085 I/OAT 0000:80:01.6 8086 
0b00 1 ioatdma - - 00:01:15.085 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - - 00:01:15.085 + rm -f /tmp/spdk-ld-path 00:01:15.085 + source autorun-spdk.conf 00:01:15.085 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:15.085 ++ SPDK_TEST_BLOCKDEV=1 00:01:15.085 ++ SPDK_TEST_ISAL=1 00:01:15.085 ++ SPDK_TEST_CRYPTO=1 00:01:15.085 ++ SPDK_TEST_REDUCE=1 00:01:15.085 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:15.085 ++ SPDK_RUN_UBSAN=1 00:01:15.085 ++ RUN_NIGHTLY=0 00:01:15.085 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:15.085 + [[ -n '' ]] 00:01:15.085 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:15.085 + for M in /var/spdk/build-*-manifest.txt 00:01:15.085 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:15.085 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:15.085 + for M in /var/spdk/build-*-manifest.txt 00:01:15.085 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:15.085 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:15.085 ++ uname 00:01:15.085 + [[ Linux == \L\i\n\u\x ]] 00:01:15.085 + sudo dmesg -T 00:01:15.085 + sudo dmesg --clear 00:01:15.085 + dmesg_pid=780154 00:01:15.085 + [[ Fedora Linux == FreeBSD ]] 00:01:15.085 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:15.085 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:15.085 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:15.085 + export VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:01:15.085 + VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:01:15.085 + [[ -x /usr/src/fio-static/fio ]] 00:01:15.085 + export FIO_BIN=/usr/src/fio-static/fio 00:01:15.085 + FIO_BIN=/usr/src/fio-static/fio 00:01:15.085 + sudo dmesg -Tw 00:01:15.085 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:15.085 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:01:15.085 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:15.086 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:15.086 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:15.086 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:15.086 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:15.086 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:15.086 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:15.347 Test configuration: 00:01:15.347 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:15.347 SPDK_TEST_BLOCKDEV=1 00:01:15.347 SPDK_TEST_ISAL=1 00:01:15.347 SPDK_TEST_CRYPTO=1 00:01:15.347 SPDK_TEST_REDUCE=1 00:01:15.347 SPDK_TEST_VBDEV_COMPRESS=1 00:01:15.347 SPDK_RUN_UBSAN=1 00:01:15.347 RUN_NIGHTLY=0 09:56:37 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:01:15.347 09:56:37 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:15.347 09:56:37 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:15.347 09:56:37 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:15.347 09:56:37 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:15.347 09:56:37 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:15.347 09:56:37 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:15.347 09:56:37 -- paths/export.sh@5 -- $ export PATH 00:01:15.347 09:56:37 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:15.347 09:56:37 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:01:15.347 09:56:37 -- common/autobuild_common.sh@437 -- $ date +%s 00:01:15.347 09:56:37 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1718006197.XXXXXX 00:01:15.347 09:56:37 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1718006197.zmZ8DB 00:01:15.347 09:56:37 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]] 00:01:15.347 09:56:37 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']' 
00:01:15.347 09:56:37 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:01:15.347 09:56:37 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:15.347 09:56:37 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:15.347 09:56:37 -- common/autobuild_common.sh@453 -- $ get_config_params 00:01:15.347 09:56:37 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:01:15.347 09:56:37 -- common/autotest_common.sh@10 -- $ set +x 00:01:15.347 09:56:37 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:01:15.347 09:56:37 -- common/autobuild_common.sh@455 -- $ start_monitor_resources 00:01:15.347 09:56:37 -- pm/common@17 -- $ local monitor 00:01:15.347 09:56:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:15.347 09:56:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:15.347 09:56:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:15.347 09:56:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:15.347 09:56:37 -- pm/common@21 -- $ date +%s 00:01:15.347 09:56:37 -- pm/common@25 -- $ sleep 1 00:01:15.347 09:56:37 -- pm/common@21 -- $ date +%s 00:01:15.347 09:56:37 -- pm/common@21 -- $ date +%s 00:01:15.347 09:56:37 -- pm/common@21 -- $ date +%s 00:01:15.347 09:56:37 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718006197 00:01:15.347 09:56:37 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718006197 00:01:15.347 09:56:37 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718006197 00:01:15.347 09:56:37 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718006197 00:01:15.347 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718006197_collect-vmstat.pm.log 00:01:15.347 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718006197_collect-cpu-load.pm.log 00:01:15.347 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718006197_collect-cpu-temp.pm.log 00:01:15.347 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718006197_collect-bmc-pm.bmc.pm.log 00:01:16.290 09:56:38 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT 00:01:16.290 09:56:38 -- spdk/autobuild.sh@11 -- $ 
SPDK_TEST_AUTOBUILD=
00:01:16.290 09:56:38 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:16.290 09:56:38 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:16.290 09:56:38 -- spdk/autobuild.sh@16 -- $ date -u
00:01:16.290 Mon Jun 10 07:56:38 AM UTC 2024
00:01:16.290 09:56:38 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:16.290 v24.09-pre-62-g3a44739b7
00:01:16.290 09:56:38 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:16.290 09:56:38 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:16.290 09:56:38 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:16.290 09:56:38 -- common/autotest_common.sh@1100 -- $ '[' 3 -le 1 ']'
00:01:16.290 09:56:38 -- common/autotest_common.sh@1106 -- $ xtrace_disable
00:01:16.290 09:56:38 -- common/autotest_common.sh@10 -- $ set +x
00:01:16.550 ************************************
00:01:16.550 START TEST ubsan
00:01:16.550 ************************************
00:01:16.550 09:56:38 ubsan -- common/autotest_common.sh@1124 -- $ echo 'using ubsan'
00:01:16.550 using ubsan
00:01:16.550
00:01:16.550 real 0m0.001s
00:01:16.550 user 0m0.000s
00:01:16.550 sys 0m0.000s
00:01:16.550 09:56:38 ubsan -- common/autotest_common.sh@1125 -- $ xtrace_disable
00:01:16.550 09:56:38 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:01:16.550 ************************************
00:01:16.550 END TEST ubsan
00:01:16.550 ************************************
00:01:16.550 09:56:38 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:01:16.550 09:56:38 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:16.550 09:56:38 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:16.550 09:56:38 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:01:16.550 09:56:38 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:16.550 09:56:38 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:16.550 09:56:38 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:16.550 09:56:38 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:01:16.550 09:56:38 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:01:16.550 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:01:16.550 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:01:17.122 Using 'verbs' RDMA provider
00:01:33.011 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:45.250 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:45.250 Creating mk/config.mk...done.
00:01:45.250 Creating mk/cc.flags.mk...done.
00:01:45.250 Type 'make' to build.
00:01:45.250 09:57:07 -- spdk/autobuild.sh@69 -- $ run_test make make -j128
00:01:45.250 09:57:07 -- common/autotest_common.sh@1100 -- $ '[' 3 -le 1 ']'
00:01:45.250 09:57:07 -- common/autotest_common.sh@1106 -- $ xtrace_disable
00:01:45.250 09:57:07 -- common/autotest_common.sh@10 -- $ set +x
00:01:45.250 ************************************
00:01:45.250 START TEST make
00:01:45.250 ************************************
00:01:45.250 09:57:07 make -- common/autotest_common.sh@1124 -- $ make -j128
00:01:45.822 make[1]: Nothing to be done for 'all'.
00:02:18.033 The Meson build system 00:02:18.033 Version: 1.3.1 00:02:18.033 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:02:18.033 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:02:18.033 Build type: native build 00:02:18.033 Program cat found: YES (/usr/bin/cat) 00:02:18.033 Project name: DPDK 00:02:18.033 Project version: 24.03.0 00:02:18.033 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:02:18.033 C linker for the host machine: cc ld.bfd 2.39-16 00:02:18.033 Host machine cpu family: x86_64 00:02:18.033 Host machine cpu: x86_64 00:02:18.033 Message: ## Building in Developer Mode ## 00:02:18.033 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:18.033 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:02:18.033 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:02:18.033 Program python3 found: YES (/usr/bin/python3) 00:02:18.033 Program cat found: YES (/usr/bin/cat) 00:02:18.033 Compiler for C supports arguments -march=native: YES 00:02:18.033 Checking for size of "void *" : 8 00:02:18.033 Checking for size of "void *" : 8 (cached) 00:02:18.033 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:02:18.033 Library m found: YES 00:02:18.033 Library numa found: YES 00:02:18.033 Has header "numaif.h" : YES 00:02:18.033 Library fdt found: NO 00:02:18.033 Library execinfo found: NO 00:02:18.033 Has header "execinfo.h" : YES 00:02:18.033 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:02:18.033 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:18.033 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:18.033 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:18.033 Run-time dependency openssl found: YES 3.0.9 00:02:18.033 Run-time dependency libpcap found: YES 1.10.4 00:02:18.033 Has header "pcap.h" with dependency libpcap: YES 00:02:18.033 Compiler for C supports arguments -Wcast-qual: YES 00:02:18.033 Compiler for C supports arguments -Wdeprecated: YES 00:02:18.033 Compiler for C supports arguments -Wformat: YES 00:02:18.033 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:18.033 Compiler for C supports arguments -Wformat-security: NO 00:02:18.033 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:18.033 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:18.033 Compiler for C supports arguments -Wnested-externs: YES 00:02:18.033 Compiler for C supports arguments -Wold-style-definition: YES 00:02:18.033 Compiler for C supports arguments -Wpointer-arith: YES 00:02:18.033 Compiler for C supports arguments -Wsign-compare: YES 00:02:18.033 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:18.033 Compiler for C supports arguments -Wundef: YES 00:02:18.033 Compiler for C supports arguments -Wwrite-strings: YES 00:02:18.033 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:18.033 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:18.033 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:18.033 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:18.033 Program objdump found: YES (/usr/bin/objdump) 00:02:18.033 Compiler for C supports arguments -mavx512f: YES 00:02:18.033 Checking if "AVX512 checking" compiles: YES 00:02:18.033 
Fetching value of define "__SSE4_2__" : 1 00:02:18.033 Fetching value of define "__AES__" : 1 00:02:18.033 Fetching value of define "__AVX__" : 1 00:02:18.033 Fetching value of define "__AVX2__" : 1 00:02:18.033 Fetching value of define "__AVX512BW__" : 1 00:02:18.033 Fetching value of define "__AVX512CD__" : 1 00:02:18.033 Fetching value of define "__AVX512DQ__" : 1 00:02:18.033 Fetching value of define "__AVX512F__" : 1 00:02:18.033 Fetching value of define "__AVX512VL__" : 1 00:02:18.033 Fetching value of define "__PCLMUL__" : 1 00:02:18.033 Fetching value of define "__RDRND__" : 1 00:02:18.033 Fetching value of define "__RDSEED__" : 1 00:02:18.033 Fetching value of define "__VPCLMULQDQ__" : 1 00:02:18.033 Fetching value of define "__znver1__" : (undefined) 00:02:18.033 Fetching value of define "__znver2__" : (undefined) 00:02:18.033 Fetching value of define "__znver3__" : (undefined) 00:02:18.033 Fetching value of define "__znver4__" : (undefined) 00:02:18.033 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:18.033 Message: lib/log: Defining dependency "log" 00:02:18.033 Message: lib/kvargs: Defining dependency "kvargs" 00:02:18.033 Message: lib/telemetry: Defining dependency "telemetry" 00:02:18.033 Checking for function "getentropy" : NO 00:02:18.033 Message: lib/eal: Defining dependency "eal" 00:02:18.033 Message: lib/ring: Defining dependency "ring" 00:02:18.033 Message: lib/rcu: Defining dependency "rcu" 00:02:18.033 Message: lib/mempool: Defining dependency "mempool" 00:02:18.033 Message: lib/mbuf: Defining dependency "mbuf" 00:02:18.033 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:18.033 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:18.033 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:18.033 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:18.033 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:18.033 Fetching value of define "__VPCLMULQDQ__" : 1 (cached) 00:02:18.033 Compiler for C supports arguments -mpclmul: YES 00:02:18.033 Compiler for C supports arguments -maes: YES 00:02:18.033 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:18.033 Compiler for C supports arguments -mavx512bw: YES 00:02:18.033 Compiler for C supports arguments -mavx512dq: YES 00:02:18.033 Compiler for C supports arguments -mavx512vl: YES 00:02:18.033 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:18.033 Compiler for C supports arguments -mavx2: YES 00:02:18.033 Compiler for C supports arguments -mavx: YES 00:02:18.033 Message: lib/net: Defining dependency "net" 00:02:18.033 Message: lib/meter: Defining dependency "meter" 00:02:18.033 Message: lib/ethdev: Defining dependency "ethdev" 00:02:18.033 Message: lib/pci: Defining dependency "pci" 00:02:18.033 Message: lib/cmdline: Defining dependency "cmdline" 00:02:18.033 Message: lib/hash: Defining dependency "hash" 00:02:18.033 Message: lib/timer: Defining dependency "timer" 00:02:18.033 Message: lib/compressdev: Defining dependency "compressdev" 00:02:18.033 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:18.033 Message: lib/dmadev: Defining dependency "dmadev" 00:02:18.033 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:18.033 Message: lib/power: Defining dependency "power" 00:02:18.033 Message: lib/reorder: Defining dependency "reorder" 00:02:18.033 Message: lib/security: Defining dependency "security" 00:02:18.033 Has header "linux/userfaultfd.h" : YES 00:02:18.033 Has header "linux/vduse.h" : YES 00:02:18.033 Message: lib/vhost: 
Defining dependency "vhost" 00:02:18.033 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:18.033 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:02:18.033 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:18.033 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:18.033 Compiler for C supports arguments -std=c11: YES 00:02:18.033 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:02:18.033 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:02:18.033 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:02:18.033 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:02:18.033 Run-time dependency libmlx5 found: YES 1.24.44.0 00:02:18.033 Run-time dependency libibverbs found: YES 1.14.44.0 00:02:18.033 Library mtcr_ul found: NO 00:02:18.033 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:02:18.033 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:02:18.033 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:02:18.033 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:02:18.033 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:02:18.033 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:02:18.033 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:02:18.033 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:02:18.033 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:02:18.033 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:02:18.033 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:02:18.034 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:02:18.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:02:18.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:02:18.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:02:18.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol 
"mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:02:18.296 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol 
"mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:18.296 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:18.296 Configuring mlx5_autoconf.h using configuration 00:02:18.296 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:18.296 Run-time dependency libcrypto found: YES 3.0.9 00:02:18.296 Library IPSec_MB found: YES 00:02:18.296 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:18.296 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:18.296 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:18.296 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:18.296 Library IPSec_MB found: YES 00:02:18.296 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:18.296 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:18.296 Compiler for C supports arguments -std=c11: YES (cached) 00:02:18.296 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:18.296 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:18.296 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:18.296 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:18.296 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:18.296 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:18.296 Library libisal found: NO 00:02:18.296 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:18.296 Compiler for C supports arguments -std=c11: YES (cached) 00:02:18.296 Compiler for C supports arguments -Wno-strict-prototypes: YES 
(cached)
00:02:18.296 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached)
00:02:18.296 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached)
00:02:18.296 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached)
00:02:18.296 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5"
00:02:18.296 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:02:18.296 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:02:18.296 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:02:18.296 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:02:18.296 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:02:18.296 Program doxygen found: YES (/usr/bin/doxygen)
00:02:18.296 Configuring doxy-api-html.conf using configuration
00:02:18.296 Configuring doxy-api-man.conf using configuration
00:02:18.296 Program mandb found: YES (/usr/bin/mandb)
00:02:18.296 Program sphinx-build found: NO
00:02:18.296 Configuring rte_build_config.h using configuration
00:02:18.296 Message:
00:02:18.296 =================
00:02:18.296 Applications Enabled
00:02:18.296 =================
00:02:18.296
00:02:18.296 apps:
00:02:18.296
00:02:18.296
00:02:18.296 Message:
00:02:18.296 =================
00:02:18.296 Libraries Enabled
00:02:18.296 =================
00:02:18.296
00:02:18.297 libs:
00:02:18.297 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:02:18.297 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:02:18.297 cryptodev, dmadev, power, reorder, security, vhost,
00:02:18.297
00:02:18.297 Message:
00:02:18.297 ===============
00:02:18.297 Drivers Enabled
00:02:18.297 ===============
00:02:18.297
00:02:18.297 common:
00:02:18.297 mlx5, qat,
00:02:18.297 bus:
00:02:18.297 auxiliary, pci, vdev,
00:02:18.297 mempool:
00:02:18.297 ring,
00:02:18.297 dma:
00:02:18.297
00:02:18.297 net:
00:02:18.297
00:02:18.297 crypto:
00:02:18.297 ipsec_mb, mlx5,
00:02:18.297 compress:
00:02:18.297 isal, mlx5,
00:02:18.297 vdpa:
00:02:18.297
00:02:18.297
00:02:18.297 Message:
00:02:18.297 =================
00:02:18.297 Content Skipped
00:02:18.297 =================
00:02:18.297
00:02:18.297 apps:
00:02:18.297 dumpcap: explicitly disabled via build config
00:02:18.297 graph: explicitly disabled via build config
00:02:18.297 pdump: explicitly disabled via build config
00:02:18.297 proc-info: explicitly disabled via build config
00:02:18.297 test-acl: explicitly disabled via build config
00:02:18.297 test-bbdev: explicitly disabled via build config
00:02:18.297 test-cmdline: explicitly disabled via build config
00:02:18.297 test-compress-perf: explicitly disabled via build config
00:02:18.297 test-crypto-perf: explicitly disabled via build config
00:02:18.297 test-dma-perf: explicitly disabled via build config
00:02:18.297 test-eventdev: explicitly disabled via build config
00:02:18.297 test-fib: explicitly disabled via build config
00:02:18.297 test-flow-perf: explicitly disabled via build config
00:02:18.297 test-gpudev: explicitly disabled via build config
00:02:18.297 test-mldev: explicitly disabled via build config
00:02:18.297 test-pipeline: explicitly disabled via build config
00:02:18.297 test-pmd: explicitly disabled via build config
00:02:18.297 test-regex: explicitly disabled via build config
00:02:18.297 test-sad: explicitly disabled via build config
00:02:18.297 test-security-perf: explicitly disabled via build config
00:02:18.297
00:02:18.297 libs:
00:02:18.297 argparse: explicitly disabled via build config 00:02:18.297 metrics: explicitly disabled via build config 00:02:18.297 acl: explicitly disabled via build config 00:02:18.297 bbdev: explicitly disabled via build config 00:02:18.297 bitratestats: explicitly disabled via build config 00:02:18.297 bpf: explicitly disabled via build config 00:02:18.297 cfgfile: explicitly disabled via build config 00:02:18.297 distributor: explicitly disabled via build config 00:02:18.297 efd: explicitly disabled via build config 00:02:18.297 eventdev: explicitly disabled via build config 00:02:18.297 dispatcher: explicitly disabled via build config 00:02:18.297 gpudev: explicitly disabled via build config 00:02:18.297 gro: explicitly disabled via build config 00:02:18.297 gso: explicitly disabled via build config 00:02:18.297 ip_frag: explicitly disabled via build config 00:02:18.297 jobstats: explicitly disabled via build config 00:02:18.297 latencystats: explicitly disabled via build config 00:02:18.297 lpm: explicitly disabled via build config 00:02:18.297 member: explicitly disabled via build config 00:02:18.297 pcapng: explicitly disabled via build config 00:02:18.297 rawdev: explicitly disabled via build config 00:02:18.297 regexdev: explicitly disabled via build config 00:02:18.297 mldev: explicitly disabled via build config 00:02:18.297 rib: explicitly disabled via build config 00:02:18.297 sched: explicitly disabled via build config 00:02:18.297 stack: explicitly disabled via build config 00:02:18.297 ipsec: explicitly disabled via build config 00:02:18.297 pdcp: explicitly disabled via build config 00:02:18.297 fib: explicitly disabled via build config 00:02:18.297 port: explicitly disabled via build config 00:02:18.297 pdump: explicitly disabled via build config 00:02:18.297 table: explicitly disabled via build config 00:02:18.297 pipeline: explicitly disabled via build config 00:02:18.297 graph: explicitly disabled via build config 00:02:18.297 node: explicitly disabled via build config 00:02:18.297 00:02:18.297 drivers: 00:02:18.297 common/cpt: not in enabled drivers build config 00:02:18.297 common/dpaax: not in enabled drivers build config 00:02:18.297 common/iavf: not in enabled drivers build config 00:02:18.297 common/idpf: not in enabled drivers build config 00:02:18.297 common/ionic: not in enabled drivers build config 00:02:18.297 common/mvep: not in enabled drivers build config 00:02:18.297 common/octeontx: not in enabled drivers build config 00:02:18.297 bus/cdx: not in enabled drivers build config 00:02:18.297 bus/dpaa: not in enabled drivers build config 00:02:18.297 bus/fslmc: not in enabled drivers build config 00:02:18.297 bus/ifpga: not in enabled drivers build config 00:02:18.297 bus/platform: not in enabled drivers build config 00:02:18.297 bus/uacce: not in enabled drivers build config 00:02:18.297 bus/vmbus: not in enabled drivers build config 00:02:18.297 common/cnxk: not in enabled drivers build config 00:02:18.297 common/nfp: not in enabled drivers build config 00:02:18.297 common/nitrox: not in enabled drivers build config 00:02:18.297 common/sfc_efx: not in enabled drivers build config 00:02:18.297 mempool/bucket: not in enabled drivers build config 00:02:18.297 mempool/cnxk: not in enabled drivers build config 00:02:18.297 mempool/dpaa: not in enabled drivers build config 00:02:18.297 mempool/dpaa2: not in enabled drivers build config 00:02:18.297 mempool/octeontx: not in enabled drivers build config 00:02:18.297 mempool/stack: not in enabled drivers build 
config 00:02:18.297 dma/cnxk: not in enabled drivers build config 00:02:18.297 dma/dpaa: not in enabled drivers build config 00:02:18.297 dma/dpaa2: not in enabled drivers build config 00:02:18.297 dma/hisilicon: not in enabled drivers build config 00:02:18.297 dma/idxd: not in enabled drivers build config 00:02:18.297 dma/ioat: not in enabled drivers build config 00:02:18.297 dma/skeleton: not in enabled drivers build config 00:02:18.297 net/af_packet: not in enabled drivers build config 00:02:18.297 net/af_xdp: not in enabled drivers build config 00:02:18.297 net/ark: not in enabled drivers build config 00:02:18.297 net/atlantic: not in enabled drivers build config 00:02:18.297 net/avp: not in enabled drivers build config 00:02:18.297 net/axgbe: not in enabled drivers build config 00:02:18.297 net/bnx2x: not in enabled drivers build config 00:02:18.297 net/bnxt: not in enabled drivers build config 00:02:18.297 net/bonding: not in enabled drivers build config 00:02:18.297 net/cnxk: not in enabled drivers build config 00:02:18.297 net/cpfl: not in enabled drivers build config 00:02:18.297 net/cxgbe: not in enabled drivers build config 00:02:18.297 net/dpaa: not in enabled drivers build config 00:02:18.297 net/dpaa2: not in enabled drivers build config 00:02:18.297 net/e1000: not in enabled drivers build config 00:02:18.297 net/ena: not in enabled drivers build config 00:02:18.297 net/enetc: not in enabled drivers build config 00:02:18.297 net/enetfec: not in enabled drivers build config 00:02:18.297 net/enic: not in enabled drivers build config 00:02:18.297 net/failsafe: not in enabled drivers build config 00:02:18.297 net/fm10k: not in enabled drivers build config 00:02:18.297 net/gve: not in enabled drivers build config 00:02:18.297 net/hinic: not in enabled drivers build config 00:02:18.297 net/hns3: not in enabled drivers build config 00:02:18.297 net/i40e: not in enabled drivers build config 00:02:18.297 net/iavf: not in enabled drivers build config 00:02:18.297 net/ice: not in enabled drivers build config 00:02:18.297 net/idpf: not in enabled drivers build config 00:02:18.297 net/igc: not in enabled drivers build config 00:02:18.297 net/ionic: not in enabled drivers build config 00:02:18.297 net/ipn3ke: not in enabled drivers build config 00:02:18.297 net/ixgbe: not in enabled drivers build config 00:02:18.297 net/mana: not in enabled drivers build config 00:02:18.297 net/memif: not in enabled drivers build config 00:02:18.297 net/mlx4: not in enabled drivers build config 00:02:18.297 net/mlx5: not in enabled drivers build config 00:02:18.297 net/mvneta: not in enabled drivers build config 00:02:18.297 net/mvpp2: not in enabled drivers build config 00:02:18.297 net/netvsc: not in enabled drivers build config 00:02:18.297 net/nfb: not in enabled drivers build config 00:02:18.297 net/nfp: not in enabled drivers build config 00:02:18.297 net/ngbe: not in enabled drivers build config 00:02:18.297 net/null: not in enabled drivers build config 00:02:18.297 net/octeontx: not in enabled drivers build config 00:02:18.297 net/octeon_ep: not in enabled drivers build config 00:02:18.297 net/pcap: not in enabled drivers build config 00:02:18.297 net/pfe: not in enabled drivers build config 00:02:18.297 net/qede: not in enabled drivers build config 00:02:18.297 net/ring: not in enabled drivers build config 00:02:18.297 net/sfc: not in enabled drivers build config 00:02:18.297 net/softnic: not in enabled drivers build config 00:02:18.297 net/tap: not in enabled drivers build config 00:02:18.297 
net/thunderx: not in enabled drivers build config 00:02:18.297 net/txgbe: not in enabled drivers build config 00:02:18.297 net/vdev_netvsc: not in enabled drivers build config 00:02:18.297 net/vhost: not in enabled drivers build config 00:02:18.297 net/virtio: not in enabled drivers build config 00:02:18.297 net/vmxnet3: not in enabled drivers build config 00:02:18.297 raw/*: missing internal dependency, "rawdev" 00:02:18.297 crypto/armv8: not in enabled drivers build config 00:02:18.297 crypto/bcmfs: not in enabled drivers build config 00:02:18.297 crypto/caam_jr: not in enabled drivers build config 00:02:18.297 crypto/ccp: not in enabled drivers build config 00:02:18.297 crypto/cnxk: not in enabled drivers build config 00:02:18.297 crypto/dpaa_sec: not in enabled drivers build config 00:02:18.297 crypto/dpaa2_sec: not in enabled drivers build config 00:02:18.297 crypto/mvsam: not in enabled drivers build config 00:02:18.297 crypto/nitrox: not in enabled drivers build config 00:02:18.297 crypto/null: not in enabled drivers build config 00:02:18.297 crypto/octeontx: not in enabled drivers build config 00:02:18.297 crypto/openssl: not in enabled drivers build config 00:02:18.297 crypto/scheduler: not in enabled drivers build config 00:02:18.297 crypto/uadk: not in enabled drivers build config 00:02:18.297 crypto/virtio: not in enabled drivers build config 00:02:18.297 compress/nitrox: not in enabled drivers build config 00:02:18.297 compress/octeontx: not in enabled drivers build config 00:02:18.297 compress/zlib: not in enabled drivers build config 00:02:18.297 regex/*: missing internal dependency, "regexdev" 00:02:18.298 ml/*: missing internal dependency, "mldev" 00:02:18.298 vdpa/ifc: not in enabled drivers build config 00:02:18.298 vdpa/mlx5: not in enabled drivers build config 00:02:18.298 vdpa/nfp: not in enabled drivers build config 00:02:18.298 vdpa/sfc: not in enabled drivers build config 00:02:18.298 event/*: missing internal dependency, "eventdev" 00:02:18.298 baseband/*: missing internal dependency, "bbdev" 00:02:18.298 gpu/*: missing internal dependency, "gpudev" 00:02:18.298 00:02:18.298 00:02:18.869 Build targets in project: 114 00:02:18.869 00:02:18.869 DPDK 24.03.0 00:02:18.869 00:02:18.869 User defined options 00:02:18.869 buildtype : debug 00:02:18.869 default_library : shared 00:02:18.869 libdir : lib 00:02:18.869 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:18.869 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:18.869 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:18.869 cpu_instruction_set: native 00:02:18.869 disable_apps : test-sad,graph,test-regex,dumpcap,test-eventdev,test-compress-perf,pdump,test-security-perf,test-pmd,test-flow-perf,test-pipeline,test-crypto-perf,test-gpudev,test-cmdline,test-dma-perf,proc-info,test-bbdev,test-acl,test,test-mldev,test-fib 00:02:18.869 disable_libs : sched,port,dispatcher,graph,rawdev,pdcp,bitratestats,ipsec,pcapng,pdump,gso,cfgfile,gpudev,ip_frag,node,distributor,mldev,lpm,acl,bpf,latencystats,eventdev,regexdev,gro,stack,fib,argparse,pipeline,bbdev,table,metrics,member,jobstats,efd,rib 00:02:18.869 enable_docs : false 00:02:18.869 
enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:18.869 enable_kmods : false 00:02:18.869 tests : false 00:02:18.869 00:02:18.869 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:19.456 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:19.456 [1/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:19.456 [2/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:19.456 [3/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:19.456 [4/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:19.733 [5/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:19.733 [6/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:19.733 [7/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:19.733 [8/377] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:19.733 [9/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:19.733 [10/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:19.733 [11/377] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:19.733 [12/377] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:19.733 [13/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:19.733 [14/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:19.733 [15/377] Linking static target lib/librte_kvargs.a 00:02:19.733 [16/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:19.733 [17/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:19.733 [18/377] Linking static target lib/librte_log.a 00:02:19.733 [19/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:19.733 [20/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:20.010 [21/377] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:20.010 [22/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:20.010 [23/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:20.010 [24/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:20.010 [25/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:20.010 [26/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:20.010 [27/377] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:20.276 [28/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:20.276 [29/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:20.276 [30/377] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:20.276 [31/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:20.276 [32/377] Linking static target lib/librte_pci.a 00:02:20.276 [33/377] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:20.276 [34/377] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:20.276 [35/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:20.276 [36/377] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:20.276 [37/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:20.541 [38/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:20.541 [39/377] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:20.541 [40/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:20.541 [41/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:20.541 [42/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:20.541 [43/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:20.541 [44/377] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:20.541 [45/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:20.541 [46/377] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:20.541 [47/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:20.541 [48/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:20.541 [49/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:20.541 [50/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:20.541 [51/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:20.541 [52/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:20.541 [53/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:20.541 [54/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:20.541 [55/377] Compiling C object lib/librte_net.a.p/net_net_crc_avx512.c.o 00:02:20.541 [56/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:20.541 [57/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:20.541 [58/377] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:20.541 [59/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:20.541 [60/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:20.541 [61/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:20.541 [62/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:20.541 [63/377] Linking static target lib/librte_ring.a 00:02:20.541 [64/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:20.541 [65/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:20.541 [66/377] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:20.541 [67/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:20.541 [68/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:20.804 [69/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:20.804 [70/377] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:20.804 [71/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:20.804 [72/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:20.804 [73/377] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:20.804 [74/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:20.804 [75/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:20.804 [76/377] 
Linking static target lib/librte_rcu.a 00:02:20.804 [77/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:20.804 [78/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:20.804 [79/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:20.804 [80/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:20.804 [81/377] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:20.804 [82/377] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:20.804 [83/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:20.804 [84/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:20.804 [85/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:20.804 [86/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:21.064 [87/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:21.064 [88/377] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:21.064 [89/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:21.064 [90/377] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:21.064 [91/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:21.064 [92/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:21.064 [93/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:21.064 [94/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:21.064 [95/377] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:21.064 [96/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:21.064 [97/377] Linking static target lib/librte_dmadev.a 00:02:21.064 [98/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:21.064 [99/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:21.064 [100/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:21.064 [101/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:21.064 [102/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:21.064 [103/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:21.064 [104/377] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:21.064 [105/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:21.064 [106/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:21.064 [107/377] Linking static target lib/librte_telemetry.a 00:02:21.064 [108/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:21.064 [109/377] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:21.064 [110/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:21.064 [111/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:21.064 [112/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:21.064 [113/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:21.064 [114/377] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:21.064 [115/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:21.064 [116/377] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:21.064 [117/377] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.064 [118/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:21.064 [119/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:21.064 [120/377] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:21.064 [121/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:21.064 [122/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:21.064 [123/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:21.064 [124/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:21.064 [125/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:21.064 [126/377] Linking static target lib/librte_meter.a 00:02:21.064 [127/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:21.064 [128/377] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:21.064 [129/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:21.064 [130/377] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:21.064 [131/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:21.064 [132/377] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.064 [133/377] Linking static target lib/librte_mempool.a 00:02:21.064 [134/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:21.064 [135/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:21.064 [136/377] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:21.064 [137/377] Linking static target lib/librte_timer.a 00:02:21.322 [138/377] Linking static target lib/librte_compressdev.a 00:02:21.322 [139/377] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:21.322 [140/377] Linking static target lib/librte_mbuf.a 00:02:21.322 [141/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:21.322 [142/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:21.322 [143/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:21.322 [144/377] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:21.323 [145/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:21.323 [146/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:21.323 [147/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:21.323 [148/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:21.323 [149/377] Linking static target lib/librte_net.a 00:02:21.323 [150/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:21.323 [151/377] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:21.323 [152/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:21.323 [153/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:21.323 [154/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:21.323 [155/377] Linking static target lib/librte_cmdline.a 00:02:21.323 [156/377] Compiling C object 
lib/librte_security.a.p/security_rte_security.c.o 00:02:21.323 [157/377] Linking static target lib/librte_security.a 00:02:21.323 [158/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:21.323 [159/377] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:21.323 [160/377] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:21.323 [161/377] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:21.323 [162/377] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.323 [163/377] Linking static target lib/librte_power.a 00:02:21.323 [164/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:21.323 [165/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:21.323 [166/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:21.323 [167/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:21.323 [168/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:21.323 [169/377] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.323 [170/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:21.323 [171/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:21.323 [172/377] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:21.323 [173/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:21.323 [174/377] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:21.323 [175/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:21.323 [176/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:21.323 [177/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:21.323 [178/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:21.582 [179/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:21.582 [180/377] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.582 [181/377] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:21.582 [182/377] Linking static target lib/librte_reorder.a 00:02:21.582 [183/377] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.582 [184/377] Linking static target lib/librte_eal.a 00:02:21.582 [185/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:21.582 [186/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:21.582 [187/377] Linking target lib/librte_log.so.24.1 00:02:21.582 [188/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:21.582 [189/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:21.582 [190/377] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.582 [191/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:21.582 [192/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:21.582 [193/377] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:21.582 [194/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:21.582 [195/377] 
Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:21.582 [196/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:21.582 [197/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:21.582 [198/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:21.582 [199/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:21.582 [200/377] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.582 [201/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:21.582 [202/377] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:21.582 [203/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:21.582 [204/377] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:21.582 [205/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:21.582 [206/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:21.582 [207/377] Linking static target drivers/librte_bus_auxiliary.a 00:02:21.582 [208/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:21.582 [209/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:21.582 [210/377] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.582 [211/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:21.582 [212/377] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:21.582 [213/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:21.582 [214/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:21.582 [215/377] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:21.582 [216/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:21.582 [217/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:21.582 [218/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:21.582 [219/377] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.582 [220/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:21.582 [221/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:21.582 [222/377] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:21.582 [223/377] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:21.582 [224/377] Linking target lib/librte_kvargs.so.24.1 00:02:21.582 [225/377] Linking static target drivers/librte_bus_vdev.a 00:02:21.582 [226/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:21.582 [227/377] Linking target lib/librte_telemetry.so.24.1 00:02:21.582 [228/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:21.582 [229/377] Compiling C object 
lib/librte_vhost.a.p/vhost_socket.c.o 00:02:21.582 [230/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:21.582 [231/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:21.843 [232/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:21.843 [233/377] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:21.843 [234/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:21.843 [235/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:21.843 [236/377] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:21.843 [237/377] Linking static target lib/librte_hash.a 00:02:21.843 [238/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:21.843 [239/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:21.843 [240/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:21.843 [241/377] Linking static target lib/librte_cryptodev.a 00:02:21.843 [242/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:21.843 [243/377] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:21.843 [244/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:21.843 [245/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:21.843 [246/377] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:21.843 [247/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:21.843 [248/377] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:21.843 [249/377] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:21.843 [250/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:21.843 [251/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:21.843 [252/377] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:21.843 [253/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:21.843 [254/377] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:21.843 [255/377] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:21.843 [256/377] Linking static target drivers/librte_bus_pci.a 00:02:21.844 [257/377] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:21.844 [258/377] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.844 [259/377] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:21.844 [260/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:21.844 [261/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:21.844 [262/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:21.844 [263/377] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.844 [264/377] Compiling C object 
drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:21.844 [265/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:21.844 [266/377] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.844 [267/377] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.844 [268/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:21.844 [269/377] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:21.844 [270/377] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:21.844 [271/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:21.844 [272/377] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.844 [273/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:21.844 [274/377] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:21.844 [275/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:21.844 [276/377] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:21.844 [277/377] Linking static target drivers/librte_mempool_ring.a 00:02:21.844 [278/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:21.844 [279/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:21.844 [280/377] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:21.844 [281/377] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:21.844 [282/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:22.104 [283/377] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.104 [284/377] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:22.104 [285/377] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:22.104 [286/377] Linking static target drivers/librte_compress_mlx5.a 00:02:22.104 [287/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:22.104 [288/377] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.104 [289/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:22.104 [290/377] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:22.104 [291/377] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:22.104 [292/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:22.104 [293/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:22.104 [294/377] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:22.104 [295/377] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:22.104 [296/377] Linking static target lib/librte_ethdev.a 00:02:22.104 [297/377] Linking static target drivers/librte_compress_isal.a 00:02:22.104 [298/377] Generating 
drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:22.104 [299/377] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:22.105 [300/377] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:22.105 [301/377] Linking static target drivers/librte_crypto_mlx5.a 00:02:22.105 [302/377] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.364 [303/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:22.364 [304/377] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:22.364 [305/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:22.364 [306/377] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:22.364 [307/377] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:22.364 [308/377] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:22.364 [309/377] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:22.364 [310/377] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.624 [311/377] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:22.624 [312/377] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.624 [313/377] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:22.625 [314/377] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:22.625 [315/377] Linking static target drivers/librte_common_mlx5.a 00:02:22.625 [316/377] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.625 [317/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:22.885 [318/377] Linking static target drivers/libtmp_rte_common_qat.a 00:02:22.885 [319/377] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:23.146 [320/377] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:23.146 [321/377] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:23.146 [322/377] Linking static target drivers/librte_common_qat.a 00:02:23.146 [323/377] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:23.146 [324/377] Linking static target lib/librte_vhost.a 00:02:24.089 [325/377] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.475 [326/377] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.778 [327/377] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.078 [328/377] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.464 [329/377] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.464 [330/377] Linking target lib/librte_eal.so.24.1 00:02:33.464 [331/377] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:33.464 [332/377] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:33.464 [333/377] Linking target lib/librte_ring.so.24.1 00:02:33.464 [334/377] Linking target 
lib/librte_pci.so.24.1 00:02:33.464 [335/377] Linking target lib/librte_timer.so.24.1 00:02:33.464 [336/377] Linking target lib/librte_meter.so.24.1 00:02:33.464 [337/377] Linking target lib/librte_dmadev.so.24.1 00:02:33.464 [338/377] Linking target drivers/librte_bus_vdev.so.24.1 00:02:33.725 [339/377] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:33.725 [340/377] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:33.725 [341/377] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:33.725 [342/377] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:33.725 [343/377] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:33.725 [344/377] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:33.725 [345/377] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:33.725 [346/377] Linking target drivers/librte_bus_pci.so.24.1 00:02:33.725 [347/377] Linking target lib/librte_rcu.so.24.1 00:02:33.725 [348/377] Linking target lib/librte_mempool.so.24.1 00:02:33.986 [349/377] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:33.986 [350/377] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:33.986 [351/377] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:33.986 [352/377] Linking target lib/librte_mbuf.so.24.1 00:02:33.986 [353/377] Linking target drivers/librte_mempool_ring.so.24.1 00:02:34.247 [354/377] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:34.247 [355/377] Linking target lib/librte_net.so.24.1 00:02:34.247 [356/377] Linking target lib/librte_reorder.so.24.1 00:02:34.247 [357/377] Linking target lib/librte_compressdev.so.24.1 00:02:34.247 [358/377] Linking target lib/librte_cryptodev.so.24.1 00:02:34.247 [359/377] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:34.247 [360/377] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:34.247 [361/377] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:34.507 [362/377] Linking target lib/librte_hash.so.24.1 00:02:34.507 [363/377] Linking target lib/librte_cmdline.so.24.1 00:02:34.507 [364/377] Linking target lib/librte_ethdev.so.24.1 00:02:34.507 [365/377] Linking target lib/librte_security.so.24.1 00:02:34.507 [366/377] Linking target drivers/librte_compress_isal.so.24.1 00:02:34.507 [367/377] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:34.507 [368/377] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:34.507 [369/377] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:34.507 [370/377] Linking target drivers/librte_common_mlx5.so.24.1 00:02:34.507 [371/377] Linking target lib/librte_power.so.24.1 00:02:34.768 [372/377] Linking target lib/librte_vhost.so.24.1 00:02:34.768 [373/377] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:34.768 [374/377] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:34.768 [375/377] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:34.768 [376/377] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:34.768 
[377/377] Linking target drivers/librte_common_qat.so.24.1 00:02:34.768 INFO: autodetecting backend as ninja 00:02:34.768 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 128 00:02:36.150 CC lib/log/log.o 00:02:36.150 CC lib/log/log_flags.o 00:02:36.150 CC lib/log/log_deprecated.o 00:02:36.150 CC lib/ut_mock/mock.o 00:02:36.150 CC lib/ut/ut.o 00:02:36.150 LIB libspdk_log.a 00:02:36.150 LIB libspdk_ut.a 00:02:36.150 LIB libspdk_ut_mock.a 00:02:36.410 SO libspdk_log.so.7.0 00:02:36.410 SO libspdk_ut_mock.so.6.0 00:02:36.410 SO libspdk_ut.so.2.0 00:02:36.410 SYMLINK libspdk_log.so 00:02:36.410 SYMLINK libspdk_ut_mock.so 00:02:36.410 SYMLINK libspdk_ut.so 00:02:36.672 CC lib/dma/dma.o 00:02:36.672 CC lib/ioat/ioat.o 00:02:36.672 CXX lib/trace_parser/trace.o 00:02:36.672 CC lib/util/base64.o 00:02:36.672 CC lib/util/bit_array.o 00:02:36.672 CC lib/util/cpuset.o 00:02:36.672 CC lib/util/crc16.o 00:02:36.672 CC lib/util/crc32.o 00:02:36.672 CC lib/util/crc32c.o 00:02:36.672 CC lib/util/crc32_ieee.o 00:02:36.672 CC lib/util/crc64.o 00:02:36.672 CC lib/util/dif.o 00:02:36.672 CC lib/util/fd.o 00:02:36.672 CC lib/util/file.o 00:02:36.672 CC lib/util/hexlify.o 00:02:36.672 CC lib/util/iov.o 00:02:36.672 CC lib/util/math.o 00:02:36.672 CC lib/util/pipe.o 00:02:36.672 CC lib/util/strerror_tls.o 00:02:36.672 CC lib/util/string.o 00:02:36.672 CC lib/util/uuid.o 00:02:36.672 CC lib/util/fd_group.o 00:02:36.672 CC lib/util/xor.o 00:02:36.672 CC lib/util/zipf.o 00:02:36.932 CC lib/vfio_user/host/vfio_user_pci.o 00:02:36.932 CC lib/vfio_user/host/vfio_user.o 00:02:36.932 LIB libspdk_dma.a 00:02:36.932 SO libspdk_dma.so.4.0 00:02:36.932 LIB libspdk_ioat.a 00:02:36.932 SYMLINK libspdk_dma.so 00:02:37.193 SO libspdk_ioat.so.7.0 00:02:37.193 SYMLINK libspdk_ioat.so 00:02:37.193 LIB libspdk_vfio_user.a 00:02:37.193 SO libspdk_vfio_user.so.5.0 00:02:37.193 LIB libspdk_util.a 00:02:37.193 SYMLINK libspdk_vfio_user.so 00:02:37.193 SO libspdk_util.so.9.1 00:02:37.453 SYMLINK libspdk_util.so 00:02:37.453 LIB libspdk_trace_parser.a 00:02:37.453 SO libspdk_trace_parser.so.5.0 00:02:37.714 SYMLINK libspdk_trace_parser.so 00:02:37.714 CC lib/reduce/reduce.o 00:02:37.714 CC lib/idxd/idxd.o 00:02:37.714 CC lib/json/json_parse.o 00:02:37.714 CC lib/idxd/idxd_user.o 00:02:37.714 CC lib/json/json_util.o 00:02:37.714 CC lib/idxd/idxd_kernel.o 00:02:37.714 CC lib/json/json_write.o 00:02:37.714 CC lib/rdma/common.o 00:02:37.714 CC lib/rdma/rdma_verbs.o 00:02:37.714 CC lib/env_dpdk/env.o 00:02:37.714 CC lib/env_dpdk/memory.o 00:02:37.714 CC lib/conf/conf.o 00:02:37.714 CC lib/vmd/vmd.o 00:02:37.714 CC lib/env_dpdk/pci.o 00:02:37.714 CC lib/vmd/led.o 00:02:37.714 CC lib/env_dpdk/init.o 00:02:37.714 CC lib/env_dpdk/threads.o 00:02:37.714 CC lib/env_dpdk/pci_ioat.o 00:02:37.714 CC lib/env_dpdk/pci_virtio.o 00:02:37.714 CC lib/env_dpdk/pci_vmd.o 00:02:37.714 CC lib/env_dpdk/pci_idxd.o 00:02:37.714 CC lib/env_dpdk/pci_event.o 00:02:37.714 CC lib/env_dpdk/sigbus_handler.o 00:02:37.714 CC lib/env_dpdk/pci_dpdk.o 00:02:37.714 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:37.714 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:37.975 LIB libspdk_conf.a 00:02:38.236 LIB libspdk_rdma.a 00:02:38.236 LIB libspdk_json.a 00:02:38.236 SO libspdk_conf.so.6.0 00:02:38.236 SO libspdk_json.so.6.0 00:02:38.236 SO libspdk_rdma.so.6.0 00:02:38.236 SYMLINK libspdk_conf.so 00:02:38.236 SYMLINK libspdk_json.so 00:02:38.236 SYMLINK libspdk_rdma.so 00:02:38.236 LIB libspdk_idxd.a 00:02:38.498 SO 
libspdk_idxd.so.12.0 00:02:38.498 LIB libspdk_reduce.a 00:02:38.498 LIB libspdk_vmd.a 00:02:38.498 SYMLINK libspdk_idxd.so 00:02:38.498 SO libspdk_reduce.so.6.0 00:02:38.498 SO libspdk_vmd.so.6.0 00:02:38.498 SYMLINK libspdk_reduce.so 00:02:38.498 SYMLINK libspdk_vmd.so 00:02:38.498 CC lib/jsonrpc/jsonrpc_server.o 00:02:38.498 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:38.498 CC lib/jsonrpc/jsonrpc_client.o 00:02:38.498 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:38.759 LIB libspdk_jsonrpc.a 00:02:39.020 SO libspdk_jsonrpc.so.6.0 00:02:39.020 SYMLINK libspdk_jsonrpc.so 00:02:39.020 LIB libspdk_env_dpdk.a 00:02:39.020 SO libspdk_env_dpdk.so.14.1 00:02:39.281 SYMLINK libspdk_env_dpdk.so 00:02:39.281 CC lib/rpc/rpc.o 00:02:39.542 LIB libspdk_rpc.a 00:02:39.542 SO libspdk_rpc.so.6.0 00:02:39.542 SYMLINK libspdk_rpc.so 00:02:40.114 CC lib/keyring/keyring.o 00:02:40.114 CC lib/notify/notify.o 00:02:40.114 CC lib/keyring/keyring_rpc.o 00:02:40.114 CC lib/notify/notify_rpc.o 00:02:40.114 CC lib/trace/trace.o 00:02:40.114 CC lib/trace/trace_flags.o 00:02:40.114 CC lib/trace/trace_rpc.o 00:02:40.114 LIB libspdk_notify.a 00:02:40.114 SO libspdk_notify.so.6.0 00:02:40.114 LIB libspdk_keyring.a 00:02:40.385 LIB libspdk_trace.a 00:02:40.385 SO libspdk_keyring.so.1.0 00:02:40.385 SYMLINK libspdk_notify.so 00:02:40.385 SO libspdk_trace.so.10.0 00:02:40.385 SYMLINK libspdk_keyring.so 00:02:40.385 SYMLINK libspdk_trace.so 00:02:40.650 CC lib/sock/sock.o 00:02:40.650 CC lib/sock/sock_rpc.o 00:02:40.650 CC lib/thread/thread.o 00:02:40.650 CC lib/thread/iobuf.o 00:02:41.258 LIB libspdk_sock.a 00:02:41.258 SO libspdk_sock.so.10.0 00:02:41.258 SYMLINK libspdk_sock.so 00:02:41.542 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:41.542 CC lib/nvme/nvme_ctrlr.o 00:02:41.542 CC lib/nvme/nvme_fabric.o 00:02:41.542 CC lib/nvme/nvme_ns_cmd.o 00:02:41.542 CC lib/nvme/nvme_ns.o 00:02:41.542 CC lib/nvme/nvme_pcie_common.o 00:02:41.542 CC lib/nvme/nvme_pcie.o 00:02:41.542 CC lib/nvme/nvme_qpair.o 00:02:41.542 CC lib/nvme/nvme.o 00:02:41.542 CC lib/nvme/nvme_quirks.o 00:02:41.542 CC lib/nvme/nvme_transport.o 00:02:41.542 CC lib/nvme/nvme_discovery.o 00:02:41.542 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:41.543 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:41.543 CC lib/nvme/nvme_tcp.o 00:02:41.543 CC lib/nvme/nvme_opal.o 00:02:41.543 CC lib/nvme/nvme_io_msg.o 00:02:41.543 CC lib/nvme/nvme_poll_group.o 00:02:41.543 CC lib/nvme/nvme_zns.o 00:02:41.543 CC lib/nvme/nvme_stubs.o 00:02:41.543 CC lib/nvme/nvme_auth.o 00:02:41.543 CC lib/nvme/nvme_cuse.o 00:02:41.543 CC lib/nvme/nvme_rdma.o 00:02:42.114 LIB libspdk_thread.a 00:02:42.114 SO libspdk_thread.so.10.1 00:02:42.114 SYMLINK libspdk_thread.so 00:02:42.375 CC lib/blob/blobstore.o 00:02:42.375 CC lib/blob/request.o 00:02:42.375 CC lib/blob/zeroes.o 00:02:42.375 CC lib/blob/blob_bs_dev.o 00:02:42.375 CC lib/init/json_config.o 00:02:42.375 CC lib/init/subsystem.o 00:02:42.375 CC lib/init/subsystem_rpc.o 00:02:42.375 CC lib/init/rpc.o 00:02:42.375 CC lib/accel/accel.o 00:02:42.375 CC lib/accel/accel_rpc.o 00:02:42.375 CC lib/accel/accel_sw.o 00:02:42.375 CC lib/virtio/virtio.o 00:02:42.375 CC lib/virtio/virtio_vhost_user.o 00:02:42.375 CC lib/virtio/virtio_vfio_user.o 00:02:42.375 CC lib/virtio/virtio_pci.o 00:02:42.637 LIB libspdk_init.a 00:02:42.637 SO libspdk_init.so.5.0 00:02:42.637 LIB libspdk_virtio.a 00:02:42.897 SYMLINK libspdk_init.so 00:02:42.897 SO libspdk_virtio.so.7.0 00:02:42.897 SYMLINK libspdk_virtio.so 00:02:43.159 CC lib/event/app.o 00:02:43.159 CC lib/event/reactor.o 00:02:43.159 CC 
lib/event/log_rpc.o 00:02:43.159 CC lib/event/app_rpc.o 00:02:43.159 CC lib/event/scheduler_static.o 00:02:43.420 LIB libspdk_nvme.a 00:02:43.420 LIB libspdk_accel.a 00:02:43.420 SO libspdk_accel.so.15.0 00:02:43.420 SO libspdk_nvme.so.13.0 00:02:43.420 SYMLINK libspdk_accel.so 00:02:43.420 LIB libspdk_event.a 00:02:43.681 SO libspdk_event.so.13.1 00:02:43.681 SYMLINK libspdk_event.so 00:02:43.681 SYMLINK libspdk_nvme.so 00:02:43.681 CC lib/bdev/bdev.o 00:02:43.681 CC lib/bdev/bdev_rpc.o 00:02:43.681 CC lib/bdev/bdev_zone.o 00:02:43.681 CC lib/bdev/part.o 00:02:43.681 CC lib/bdev/scsi_nvme.o 00:02:45.068 LIB libspdk_blob.a 00:02:45.068 SO libspdk_blob.so.11.0 00:02:45.068 SYMLINK libspdk_blob.so 00:02:45.329 CC lib/blobfs/blobfs.o 00:02:45.329 CC lib/blobfs/tree.o 00:02:45.329 CC lib/lvol/lvol.o 00:02:45.900 LIB libspdk_bdev.a 00:02:45.900 SO libspdk_bdev.so.15.0 00:02:45.900 LIB libspdk_blobfs.a 00:02:46.161 SO libspdk_blobfs.so.10.0 00:02:46.161 SYMLINK libspdk_bdev.so 00:02:46.161 LIB libspdk_lvol.a 00:02:46.161 SYMLINK libspdk_blobfs.so 00:02:46.161 SO libspdk_lvol.so.10.0 00:02:46.161 SYMLINK libspdk_lvol.so 00:02:46.426 CC lib/nvmf/ctrlr.o 00:02:46.426 CC lib/nvmf/ctrlr_discovery.o 00:02:46.426 CC lib/nvmf/ctrlr_bdev.o 00:02:46.426 CC lib/nvmf/subsystem.o 00:02:46.426 CC lib/nvmf/nvmf.o 00:02:46.426 CC lib/nvmf/nvmf_rpc.o 00:02:46.426 CC lib/nvmf/transport.o 00:02:46.426 CC lib/nvmf/tcp.o 00:02:46.426 CC lib/nbd/nbd.o 00:02:46.426 CC lib/nvmf/stubs.o 00:02:46.426 CC lib/nvmf/mdns_server.o 00:02:46.426 CC lib/nbd/nbd_rpc.o 00:02:46.426 CC lib/ftl/ftl_core.o 00:02:46.426 CC lib/ublk/ublk.o 00:02:46.426 CC lib/nvmf/rdma.o 00:02:46.426 CC lib/ftl/ftl_init.o 00:02:46.426 CC lib/nvmf/auth.o 00:02:46.426 CC lib/scsi/dev.o 00:02:46.426 CC lib/ublk/ublk_rpc.o 00:02:46.426 CC lib/scsi/lun.o 00:02:46.426 CC lib/ftl/ftl_layout.o 00:02:46.426 CC lib/scsi/port.o 00:02:46.426 CC lib/ftl/ftl_debug.o 00:02:46.426 CC lib/ftl/ftl_io.o 00:02:46.426 CC lib/scsi/scsi.o 00:02:46.426 CC lib/scsi/scsi_bdev.o 00:02:46.426 CC lib/ftl/ftl_sb.o 00:02:46.426 CC lib/ftl/ftl_l2p.o 00:02:46.426 CC lib/scsi/scsi_pr.o 00:02:46.426 CC lib/scsi/scsi_rpc.o 00:02:46.426 CC lib/ftl/ftl_l2p_flat.o 00:02:46.426 CC lib/scsi/task.o 00:02:46.426 CC lib/ftl/ftl_nv_cache.o 00:02:46.426 CC lib/ftl/ftl_band.o 00:02:46.426 CC lib/ftl/ftl_band_ops.o 00:02:46.426 CC lib/ftl/ftl_writer.o 00:02:46.426 CC lib/ftl/ftl_rq.o 00:02:46.426 CC lib/ftl/ftl_reloc.o 00:02:46.426 CC lib/ftl/ftl_p2l.o 00:02:46.426 CC lib/ftl/ftl_l2p_cache.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:46.426 CC lib/ftl/utils/ftl_conf.o 00:02:46.426 CC lib/ftl/utils/ftl_md.o 00:02:46.426 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:46.426 CC lib/ftl/utils/ftl_mempool.o 00:02:46.426 CC lib/ftl/utils/ftl_bitmap.o 00:02:46.426 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:46.426 CC lib/ftl/utils/ftl_property.o 00:02:46.426 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:46.426 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:46.426 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:46.426 
CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:46.426 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:46.426 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:46.426 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:46.426 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:46.426 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:46.426 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:46.426 CC lib/ftl/base/ftl_base_bdev.o 00:02:46.426 CC lib/ftl/ftl_trace.o 00:02:46.426 CC lib/ftl/base/ftl_base_dev.o 00:02:46.999 LIB libspdk_nbd.a 00:02:47.260 SO libspdk_nbd.so.7.0 00:02:47.260 SYMLINK libspdk_nbd.so 00:02:47.260 LIB libspdk_ublk.a 00:02:47.260 LIB libspdk_scsi.a 00:02:47.260 SO libspdk_ublk.so.3.0 00:02:47.260 SO libspdk_scsi.so.9.0 00:02:47.260 SYMLINK libspdk_ublk.so 00:02:47.521 SYMLINK libspdk_scsi.so 00:02:47.521 LIB libspdk_ftl.a 00:02:47.783 SO libspdk_ftl.so.9.0 00:02:47.783 CC lib/vhost/vhost.o 00:02:47.783 CC lib/vhost/vhost_rpc.o 00:02:47.783 CC lib/vhost/vhost_scsi.o 00:02:47.783 CC lib/vhost/vhost_blk.o 00:02:47.783 CC lib/vhost/rte_vhost_user.o 00:02:47.783 CC lib/iscsi/conn.o 00:02:47.783 CC lib/iscsi/init_grp.o 00:02:47.783 CC lib/iscsi/iscsi.o 00:02:47.783 CC lib/iscsi/md5.o 00:02:47.783 CC lib/iscsi/param.o 00:02:47.783 CC lib/iscsi/portal_grp.o 00:02:47.783 CC lib/iscsi/tgt_node.o 00:02:47.783 CC lib/iscsi/iscsi_subsystem.o 00:02:47.783 CC lib/iscsi/iscsi_rpc.o 00:02:47.783 CC lib/iscsi/task.o 00:02:48.045 SYMLINK libspdk_ftl.so 00:02:48.306 LIB libspdk_nvmf.a 00:02:48.568 SO libspdk_nvmf.so.19.0 00:02:48.568 LIB libspdk_vhost.a 00:02:48.568 SYMLINK libspdk_nvmf.so 00:02:48.829 SO libspdk_vhost.so.8.0 00:02:48.829 SYMLINK libspdk_vhost.so 00:02:48.829 LIB libspdk_iscsi.a 00:02:48.829 SO libspdk_iscsi.so.8.0 00:02:49.090 SYMLINK libspdk_iscsi.so 00:02:49.661 CC module/env_dpdk/env_dpdk_rpc.o 00:02:49.922 LIB libspdk_env_dpdk_rpc.a 00:02:49.922 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:49.922 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:49.922 CC module/blob/bdev/blob_bdev.o 00:02:49.922 CC module/sock/posix/posix.o 00:02:49.923 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:49.923 CC module/keyring/linux/keyring.o 00:02:49.923 CC module/accel/dsa/accel_dsa.o 00:02:49.923 CC module/keyring/linux/keyring_rpc.o 00:02:49.923 CC module/accel/dsa/accel_dsa_rpc.o 00:02:49.923 CC module/accel/error/accel_error.o 00:02:49.923 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:49.923 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:49.923 CC module/keyring/file/keyring.o 00:02:49.923 CC module/accel/error/accel_error_rpc.o 00:02:49.923 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:49.923 CC module/accel/ioat/accel_ioat.o 00:02:49.923 CC module/keyring/file/keyring_rpc.o 00:02:49.923 CC module/accel/iaa/accel_iaa.o 00:02:49.923 CC module/accel/ioat/accel_ioat_rpc.o 00:02:49.923 CC module/accel/iaa/accel_iaa_rpc.o 00:02:49.923 SO libspdk_env_dpdk_rpc.so.6.0 00:02:49.923 CC module/scheduler/gscheduler/gscheduler.o 00:02:49.923 SYMLINK libspdk_env_dpdk_rpc.so 00:02:49.923 LIB libspdk_keyring_linux.a 00:02:49.923 LIB libspdk_keyring_file.a 00:02:50.183 LIB libspdk_scheduler_dpdk_governor.a 00:02:50.183 LIB libspdk_scheduler_gscheduler.a 00:02:50.183 LIB libspdk_accel_ioat.a 00:02:50.183 SO libspdk_keyring_file.so.1.0 00:02:50.183 SO libspdk_keyring_linux.so.1.0 00:02:50.183 SO libspdk_scheduler_gscheduler.so.4.0 00:02:50.183 LIB libspdk_accel_error.a 00:02:50.183 LIB libspdk_scheduler_dynamic.a 00:02:50.183 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:50.183 
LIB libspdk_accel_iaa.a 00:02:50.183 LIB libspdk_blob_bdev.a 00:02:50.183 SO libspdk_accel_error.so.2.0 00:02:50.183 SO libspdk_accel_ioat.so.6.0 00:02:50.183 SO libspdk_scheduler_dynamic.so.4.0 00:02:50.183 LIB libspdk_accel_dsa.a 00:02:50.183 SO libspdk_accel_iaa.so.3.0 00:02:50.183 SYMLINK libspdk_scheduler_gscheduler.so 00:02:50.183 SYMLINK libspdk_keyring_file.so 00:02:50.183 SO libspdk_blob_bdev.so.11.0 00:02:50.183 SYMLINK libspdk_keyring_linux.so 00:02:50.183 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:50.183 SO libspdk_accel_dsa.so.5.0 00:02:50.183 SYMLINK libspdk_accel_error.so 00:02:50.183 SYMLINK libspdk_accel_ioat.so 00:02:50.183 SYMLINK libspdk_scheduler_dynamic.so 00:02:50.183 SYMLINK libspdk_accel_iaa.so 00:02:50.183 SYMLINK libspdk_blob_bdev.so 00:02:50.183 SYMLINK libspdk_accel_dsa.so 00:02:50.447 LIB libspdk_sock_posix.a 00:02:50.709 SO libspdk_sock_posix.so.6.0 00:02:50.709 SYMLINK libspdk_sock_posix.so 00:02:50.709 CC module/bdev/error/vbdev_error.o 00:02:50.709 CC module/bdev/delay/vbdev_delay.o 00:02:50.709 CC module/bdev/error/vbdev_error_rpc.o 00:02:50.709 CC module/bdev/lvol/vbdev_lvol.o 00:02:50.709 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:50.709 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:50.709 CC module/blobfs/bdev/blobfs_bdev.o 00:02:50.709 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:50.709 CC module/bdev/gpt/gpt.o 00:02:50.709 CC module/bdev/gpt/vbdev_gpt.o 00:02:50.709 CC module/bdev/nvme/bdev_nvme.o 00:02:50.709 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:50.709 CC module/bdev/nvme/nvme_rpc.o 00:02:50.709 CC module/bdev/malloc/bdev_malloc.o 00:02:50.709 CC module/bdev/aio/bdev_aio.o 00:02:50.709 CC module/bdev/null/bdev_null.o 00:02:50.709 CC module/bdev/nvme/bdev_mdns_client.o 00:02:50.709 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:50.709 CC module/bdev/raid/bdev_raid.o 00:02:50.709 CC module/bdev/compress/vbdev_compress.o 00:02:50.709 CC module/bdev/split/vbdev_split.o 00:02:50.709 CC module/bdev/raid/bdev_raid_rpc.o 00:02:50.709 CC module/bdev/null/bdev_null_rpc.o 00:02:50.709 CC module/bdev/aio/bdev_aio_rpc.o 00:02:50.709 CC module/bdev/iscsi/bdev_iscsi.o 00:02:50.709 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:50.709 CC module/bdev/nvme/vbdev_opal.o 00:02:50.709 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:50.709 CC module/bdev/split/vbdev_split_rpc.o 00:02:50.709 CC module/bdev/raid/bdev_raid_sb.o 00:02:50.709 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:50.709 CC module/bdev/ftl/bdev_ftl.o 00:02:50.709 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:50.709 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:50.709 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:50.709 CC module/bdev/raid/raid0.o 00:02:50.709 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:50.709 CC module/bdev/raid/concat.o 00:02:50.709 CC module/bdev/raid/raid1.o 00:02:50.709 CC module/bdev/crypto/vbdev_crypto.o 00:02:50.709 CC module/bdev/passthru/vbdev_passthru.o 00:02:50.709 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:50.710 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:50.710 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:50.710 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:50.710 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:50.969 LIB libspdk_accel_dpdk_compressdev.a 00:02:50.969 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:50.969 LIB libspdk_blobfs_bdev.a 00:02:50.969 LIB libspdk_accel_dpdk_cryptodev.a 00:02:51.229 SO libspdk_blobfs_bdev.so.6.0 00:02:51.229 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:51.229 SO libspdk_accel_dpdk_cryptodev.so.3.0 
00:02:51.229 LIB libspdk_bdev_split.a 00:02:51.229 LIB libspdk_bdev_gpt.a 00:02:51.229 LIB libspdk_bdev_null.a 00:02:51.229 SO libspdk_bdev_split.so.6.0 00:02:51.229 SYMLINK libspdk_blobfs_bdev.so 00:02:51.229 LIB libspdk_bdev_error.a 00:02:51.229 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:51.229 SO libspdk_bdev_gpt.so.6.0 00:02:51.229 LIB libspdk_bdev_ftl.a 00:02:51.229 SO libspdk_bdev_null.so.6.0 00:02:51.229 LIB libspdk_bdev_delay.a 00:02:51.229 LIB libspdk_bdev_aio.a 00:02:51.229 LIB libspdk_bdev_zone_block.a 00:02:51.229 SO libspdk_bdev_error.so.6.0 00:02:51.229 SYMLINK libspdk_bdev_split.so 00:02:51.229 SO libspdk_bdev_ftl.so.6.0 00:02:51.229 LIB libspdk_bdev_iscsi.a 00:02:51.229 LIB libspdk_bdev_malloc.a 00:02:51.229 SO libspdk_bdev_delay.so.6.0 00:02:51.229 LIB libspdk_bdev_compress.a 00:02:51.229 LIB libspdk_bdev_passthru.a 00:02:51.229 SO libspdk_bdev_zone_block.so.6.0 00:02:51.229 SO libspdk_bdev_aio.so.6.0 00:02:51.229 SYMLINK libspdk_bdev_gpt.so 00:02:51.229 SYMLINK libspdk_bdev_null.so 00:02:51.229 SO libspdk_bdev_iscsi.so.6.0 00:02:51.229 SO libspdk_bdev_malloc.so.6.0 00:02:51.229 SYMLINK libspdk_bdev_error.so 00:02:51.229 LIB libspdk_bdev_crypto.a 00:02:51.229 SO libspdk_bdev_passthru.so.6.0 00:02:51.229 SO libspdk_bdev_compress.so.6.0 00:02:51.229 SYMLINK libspdk_bdev_ftl.so 00:02:51.229 SYMLINK libspdk_bdev_delay.so 00:02:51.490 SYMLINK libspdk_bdev_aio.so 00:02:51.490 SO libspdk_bdev_crypto.so.6.0 00:02:51.490 SYMLINK libspdk_bdev_zone_block.so 00:02:51.490 LIB libspdk_bdev_lvol.a 00:02:51.490 SYMLINK libspdk_bdev_iscsi.so 00:02:51.490 SYMLINK libspdk_bdev_malloc.so 00:02:51.490 SYMLINK libspdk_bdev_compress.so 00:02:51.490 SYMLINK libspdk_bdev_passthru.so 00:02:51.490 SO libspdk_bdev_lvol.so.6.0 00:02:51.490 SYMLINK libspdk_bdev_crypto.so 00:02:51.490 LIB libspdk_bdev_virtio.a 00:02:51.490 SO libspdk_bdev_virtio.so.6.0 00:02:51.490 SYMLINK libspdk_bdev_lvol.so 00:02:51.490 SYMLINK libspdk_bdev_virtio.so 00:02:51.750 LIB libspdk_bdev_raid.a 00:02:51.750 SO libspdk_bdev_raid.so.6.0 00:02:52.011 SYMLINK libspdk_bdev_raid.so 00:02:52.953 LIB libspdk_bdev_nvme.a 00:02:52.953 SO libspdk_bdev_nvme.so.7.0 00:02:52.953 SYMLINK libspdk_bdev_nvme.so 00:02:53.524 CC module/event/subsystems/keyring/keyring.o 00:02:53.524 CC module/event/subsystems/iobuf/iobuf.o 00:02:53.524 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:53.524 CC module/event/subsystems/vmd/vmd.o 00:02:53.524 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:53.524 CC module/event/subsystems/sock/sock.o 00:02:53.524 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:53.524 CC module/event/subsystems/scheduler/scheduler.o 00:02:53.784 LIB libspdk_event_vhost_blk.a 00:02:53.784 LIB libspdk_event_vmd.a 00:02:53.784 LIB libspdk_event_keyring.a 00:02:53.784 LIB libspdk_event_scheduler.a 00:02:53.784 LIB libspdk_event_sock.a 00:02:53.784 LIB libspdk_event_iobuf.a 00:02:53.784 SO libspdk_event_vmd.so.6.0 00:02:53.784 SO libspdk_event_vhost_blk.so.3.0 00:02:53.784 SO libspdk_event_keyring.so.1.0 00:02:53.784 SO libspdk_event_sock.so.5.0 00:02:53.784 SO libspdk_event_scheduler.so.4.0 00:02:53.784 SO libspdk_event_iobuf.so.3.0 00:02:53.784 SYMLINK libspdk_event_vmd.so 00:02:53.784 SYMLINK libspdk_event_vhost_blk.so 00:02:53.784 SYMLINK libspdk_event_keyring.so 00:02:53.784 SYMLINK libspdk_event_sock.so 00:02:53.784 SYMLINK libspdk_event_scheduler.so 00:02:54.045 SYMLINK libspdk_event_iobuf.so 00:02:54.305 CC module/event/subsystems/accel/accel.o 00:02:54.305 LIB libspdk_event_accel.a 00:02:54.566 SO 
libspdk_event_accel.so.6.0 00:02:54.566 SYMLINK libspdk_event_accel.so 00:02:54.827 CC module/event/subsystems/bdev/bdev.o 00:02:55.087 LIB libspdk_event_bdev.a 00:02:55.087 SO libspdk_event_bdev.so.6.0 00:02:55.087 SYMLINK libspdk_event_bdev.so 00:02:55.660 CC module/event/subsystems/scsi/scsi.o 00:02:55.660 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:55.660 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:55.660 CC module/event/subsystems/nbd/nbd.o 00:02:55.660 CC module/event/subsystems/ublk/ublk.o 00:02:55.660 LIB libspdk_event_scsi.a 00:02:55.660 LIB libspdk_event_ublk.a 00:02:55.660 LIB libspdk_event_nbd.a 00:02:55.660 SO libspdk_event_scsi.so.6.0 00:02:55.660 SO libspdk_event_nbd.so.6.0 00:02:55.660 SO libspdk_event_ublk.so.3.0 00:02:55.660 LIB libspdk_event_nvmf.a 00:02:55.922 SYMLINK libspdk_event_scsi.so 00:02:55.922 SO libspdk_event_nvmf.so.6.0 00:02:55.922 SYMLINK libspdk_event_nbd.so 00:02:55.922 SYMLINK libspdk_event_ublk.so 00:02:55.922 SYMLINK libspdk_event_nvmf.so 00:02:56.183 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:56.183 CC module/event/subsystems/iscsi/iscsi.o 00:02:56.444 LIB libspdk_event_vhost_scsi.a 00:02:56.445 LIB libspdk_event_iscsi.a 00:02:56.445 SO libspdk_event_vhost_scsi.so.3.0 00:02:56.445 SO libspdk_event_iscsi.so.6.0 00:02:56.445 SYMLINK libspdk_event_vhost_scsi.so 00:02:56.445 SYMLINK libspdk_event_iscsi.so 00:02:56.706 SO libspdk.so.6.0 00:02:56.706 SYMLINK libspdk.so 00:02:56.969 CC app/spdk_nvme_perf/perf.o 00:02:56.969 CC app/spdk_lspci/spdk_lspci.o 00:02:56.969 CXX app/trace/trace.o 00:02:56.969 CC test/rpc_client/rpc_client_test.o 00:02:56.969 CC app/spdk_nvme_discover/discovery_aer.o 00:02:57.231 CC app/spdk_dd/spdk_dd.o 00:02:57.231 CC app/spdk_top/spdk_top.o 00:02:57.231 CC app/spdk_nvme_identify/identify.o 00:02:57.231 TEST_HEADER include/spdk/accel.h 00:02:57.231 TEST_HEADER include/spdk/assert.h 00:02:57.231 TEST_HEADER include/spdk/barrier.h 00:02:57.231 TEST_HEADER include/spdk/accel_module.h 00:02:57.231 TEST_HEADER include/spdk/bdev_zone.h 00:02:57.232 TEST_HEADER include/spdk/base64.h 00:02:57.232 TEST_HEADER include/spdk/bdev_module.h 00:02:57.232 CC app/vhost/vhost.o 00:02:57.232 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:57.232 TEST_HEADER include/spdk/bit_pool.h 00:02:57.232 TEST_HEADER include/spdk/bdev.h 00:02:57.232 CC app/iscsi_tgt/iscsi_tgt.o 00:02:57.232 TEST_HEADER include/spdk/blob_bdev.h 00:02:57.232 TEST_HEADER include/spdk/bit_array.h 00:02:57.232 CC app/nvmf_tgt/nvmf_main.o 00:02:57.232 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:57.232 TEST_HEADER include/spdk/blobfs.h 00:02:57.232 TEST_HEADER include/spdk/conf.h 00:02:57.232 TEST_HEADER include/spdk/blob.h 00:02:57.232 CC app/trace_record/trace_record.o 00:02:57.232 TEST_HEADER include/spdk/config.h 00:02:57.232 TEST_HEADER include/spdk/cpuset.h 00:02:57.232 TEST_HEADER include/spdk/crc32.h 00:02:57.232 TEST_HEADER include/spdk/crc16.h 00:02:57.232 TEST_HEADER include/spdk/crc64.h 00:02:57.232 TEST_HEADER include/spdk/endian.h 00:02:57.232 TEST_HEADER include/spdk/dma.h 00:02:57.232 TEST_HEADER include/spdk/dif.h 00:02:57.232 CC app/spdk_tgt/spdk_tgt.o 00:02:57.232 TEST_HEADER include/spdk/env_dpdk.h 00:02:57.232 TEST_HEADER include/spdk/event.h 00:02:57.232 TEST_HEADER include/spdk/env.h 00:02:57.232 TEST_HEADER include/spdk/fd_group.h 00:02:57.232 TEST_HEADER include/spdk/file.h 00:02:57.232 TEST_HEADER include/spdk/ftl.h 00:02:57.232 TEST_HEADER include/spdk/fd.h 00:02:57.232 TEST_HEADER include/spdk/gpt_spec.h 00:02:57.232 TEST_HEADER 
include/spdk/hexlify.h 00:02:57.232 TEST_HEADER include/spdk/histogram_data.h 00:02:57.232 TEST_HEADER include/spdk/idxd_spec.h 00:02:57.232 TEST_HEADER include/spdk/idxd.h 00:02:57.232 TEST_HEADER include/spdk/init.h 00:02:57.232 TEST_HEADER include/spdk/ioat.h 00:02:57.232 TEST_HEADER include/spdk/ioat_spec.h 00:02:57.232 TEST_HEADER include/spdk/iscsi_spec.h 00:02:57.232 TEST_HEADER include/spdk/keyring.h 00:02:57.232 TEST_HEADER include/spdk/jsonrpc.h 00:02:57.232 TEST_HEADER include/spdk/keyring_module.h 00:02:57.232 TEST_HEADER include/spdk/json.h 00:02:57.232 TEST_HEADER include/spdk/likely.h 00:02:57.232 TEST_HEADER include/spdk/lvol.h 00:02:57.232 TEST_HEADER include/spdk/log.h 00:02:57.232 TEST_HEADER include/spdk/memory.h 00:02:57.232 TEST_HEADER include/spdk/nbd.h 00:02:57.232 TEST_HEADER include/spdk/mmio.h 00:02:57.232 TEST_HEADER include/spdk/notify.h 00:02:57.232 TEST_HEADER include/spdk/nvme.h 00:02:57.232 TEST_HEADER include/spdk/nvme_intel.h 00:02:57.232 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:57.232 TEST_HEADER include/spdk/nvme_spec.h 00:02:57.232 TEST_HEADER include/spdk/nvme_zns.h 00:02:57.232 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:57.232 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:57.232 TEST_HEADER include/spdk/nvmf.h 00:02:57.232 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:57.232 TEST_HEADER include/spdk/nvmf_spec.h 00:02:57.232 TEST_HEADER include/spdk/opal.h 00:02:57.232 TEST_HEADER include/spdk/nvmf_transport.h 00:02:57.232 TEST_HEADER include/spdk/pipe.h 00:02:57.232 TEST_HEADER include/spdk/pci_ids.h 00:02:57.232 TEST_HEADER include/spdk/opal_spec.h 00:02:57.232 TEST_HEADER include/spdk/reduce.h 00:02:57.232 TEST_HEADER include/spdk/rpc.h 00:02:57.232 TEST_HEADER include/spdk/scsi.h 00:02:57.232 TEST_HEADER include/spdk/scheduler.h 00:02:57.232 TEST_HEADER include/spdk/scsi_spec.h 00:02:57.232 TEST_HEADER include/spdk/queue.h 00:02:57.232 TEST_HEADER include/spdk/sock.h 00:02:57.232 TEST_HEADER include/spdk/stdinc.h 00:02:57.232 TEST_HEADER include/spdk/string.h 00:02:57.232 TEST_HEADER include/spdk/trace.h 00:02:57.232 TEST_HEADER include/spdk/thread.h 00:02:57.232 TEST_HEADER include/spdk/trace_parser.h 00:02:57.232 TEST_HEADER include/spdk/tree.h 00:02:57.232 TEST_HEADER include/spdk/ublk.h 00:02:57.232 TEST_HEADER include/spdk/util.h 00:02:57.232 TEST_HEADER include/spdk/uuid.h 00:02:57.232 TEST_HEADER include/spdk/version.h 00:02:57.232 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:57.232 TEST_HEADER include/spdk/vhost.h 00:02:57.232 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:57.232 TEST_HEADER include/spdk/vmd.h 00:02:57.232 TEST_HEADER include/spdk/xor.h 00:02:57.232 TEST_HEADER include/spdk/zipf.h 00:02:57.232 CXX test/cpp_headers/assert.o 00:02:57.232 CXX test/cpp_headers/accel.o 00:02:57.232 CXX test/cpp_headers/accel_module.o 00:02:57.232 CXX test/cpp_headers/barrier.o 00:02:57.232 CXX test/cpp_headers/bdev.o 00:02:57.232 CXX test/cpp_headers/base64.o 00:02:57.232 CC examples/util/zipf/zipf.o 00:02:57.232 CXX test/cpp_headers/bdev_module.o 00:02:57.232 CXX test/cpp_headers/bit_pool.o 00:02:57.232 CXX test/cpp_headers/bit_array.o 00:02:57.232 CXX test/cpp_headers/blobfs_bdev.o 00:02:57.232 CXX test/cpp_headers/bdev_zone.o 00:02:57.232 CXX test/cpp_headers/blob_bdev.o 00:02:57.232 CXX test/cpp_headers/blob.o 00:02:57.232 CXX test/cpp_headers/blobfs.o 00:02:57.232 CXX test/cpp_headers/conf.o 00:02:57.522 CXX test/cpp_headers/config.o 00:02:57.522 CXX test/cpp_headers/cpuset.o 00:02:57.522 CXX test/cpp_headers/crc16.o 00:02:57.522 
CXX test/cpp_headers/dif.o 00:02:57.522 CXX test/cpp_headers/crc32.o 00:02:57.522 CXX test/cpp_headers/crc64.o 00:02:57.522 CXX test/cpp_headers/endian.o 00:02:57.522 CXX test/cpp_headers/dma.o 00:02:57.522 CC examples/sock/hello_world/hello_sock.o 00:02:57.522 CXX test/cpp_headers/env_dpdk.o 00:02:57.522 CXX test/cpp_headers/env.o 00:02:57.522 CXX test/cpp_headers/event.o 00:02:57.522 CXX test/cpp_headers/gpt_spec.o 00:02:57.522 CXX test/cpp_headers/fd.o 00:02:57.522 CXX test/cpp_headers/ftl.o 00:02:57.522 CXX test/cpp_headers/fd_group.o 00:02:57.522 CXX test/cpp_headers/hexlify.o 00:02:57.522 CXX test/cpp_headers/histogram_data.o 00:02:57.522 CXX test/cpp_headers/idxd.o 00:02:57.522 CXX test/cpp_headers/file.o 00:02:57.522 CXX test/cpp_headers/ioat_spec.o 00:02:57.522 CXX test/cpp_headers/iscsi_spec.o 00:02:57.522 CXX test/cpp_headers/idxd_spec.o 00:02:57.522 CC test/env/memory/memory_ut.o 00:02:57.522 CXX test/cpp_headers/init.o 00:02:57.522 CXX test/cpp_headers/ioat.o 00:02:57.522 CXX test/cpp_headers/jsonrpc.o 00:02:57.522 CXX test/cpp_headers/json.o 00:02:57.522 CXX test/cpp_headers/keyring.o 00:02:57.522 CXX test/cpp_headers/keyring_module.o 00:02:57.522 CC examples/nvme/hello_world/hello_world.o 00:02:57.522 CXX test/cpp_headers/likely.o 00:02:57.522 CC test/app/stub/stub.o 00:02:57.522 CC test/thread/poller_perf/poller_perf.o 00:02:57.522 CXX test/cpp_headers/log.o 00:02:57.522 CXX test/cpp_headers/memory.o 00:02:57.522 CC examples/vmd/lsvmd/lsvmd.o 00:02:57.522 CC examples/accel/perf/accel_perf.o 00:02:57.522 CC test/nvme/reserve/reserve.o 00:02:57.522 CC test/nvme/aer/aer.o 00:02:57.522 CXX test/cpp_headers/mmio.o 00:02:57.522 CXX test/cpp_headers/nbd.o 00:02:57.522 CXX test/cpp_headers/lvol.o 00:02:57.522 CC test/nvme/fdp/fdp.o 00:02:57.522 CC examples/ioat/perf/perf.o 00:02:57.522 CC test/nvme/reset/reset.o 00:02:57.522 CXX test/cpp_headers/notify.o 00:02:57.522 CC test/event/reactor/reactor.o 00:02:57.522 CXX test/cpp_headers/nvme.o 00:02:57.522 CC test/nvme/startup/startup.o 00:02:57.522 CC examples/nvme/reconnect/reconnect.o 00:02:57.522 CC examples/vmd/led/led.o 00:02:57.522 CC examples/ioat/verify/verify.o 00:02:57.522 CC test/env/pci/pci_ut.o 00:02:57.522 CC test/app/jsoncat/jsoncat.o 00:02:57.522 CC test/app/histogram_perf/histogram_perf.o 00:02:57.522 CC test/nvme/fused_ordering/fused_ordering.o 00:02:57.522 CC app/fio/nvme/fio_plugin.o 00:02:57.522 CC examples/nvme/abort/abort.o 00:02:57.522 CC examples/blob/hello_world/hello_blob.o 00:02:57.522 CC test/nvme/e2edp/nvme_dp.o 00:02:57.522 CC test/nvme/sgl/sgl.o 00:02:57.522 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:57.522 CC test/bdev/bdevio/bdevio.o 00:02:57.522 CC test/event/reactor_perf/reactor_perf.o 00:02:57.522 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:57.522 CXX test/cpp_headers/nvme_intel.o 00:02:57.522 CC examples/nvme/hotplug/hotplug.o 00:02:57.522 CC test/nvme/overhead/overhead.o 00:02:57.522 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:57.522 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:57.522 CC test/event/event_perf/event_perf.o 00:02:57.522 CC examples/blob/cli/blobcli.o 00:02:57.522 CC examples/nvmf/nvmf/nvmf.o 00:02:57.522 CC test/nvme/connect_stress/connect_stress.o 00:02:57.522 LINK rpc_client_test 00:02:57.522 CC test/nvme/boot_partition/boot_partition.o 00:02:57.522 CC test/event/app_repeat/app_repeat.o 00:02:57.522 CXX test/cpp_headers/nvme_ocssd.o 00:02:57.522 CC examples/idxd/perf/perf.o 00:02:57.522 CC test/env/vtophys/vtophys.o 00:02:57.803 CC 
test/nvme/cuse/cuse.o 00:02:57.803 CC examples/bdev/hello_world/hello_bdev.o 00:02:57.803 CC test/nvme/err_injection/err_injection.o 00:02:57.803 CC test/app/bdev_svc/bdev_svc.o 00:02:57.803 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:57.803 LINK interrupt_tgt 00:02:57.803 CC test/nvme/compliance/nvme_compliance.o 00:02:57.803 CC examples/nvme/arbitration/arbitration.o 00:02:57.803 CC app/fio/bdev/fio_plugin.o 00:02:57.803 LINK vhost 00:02:57.803 CC test/accel/dif/dif.o 00:02:57.803 CC test/nvme/simple_copy/simple_copy.o 00:02:57.803 CC test/blobfs/mkfs/mkfs.o 00:02:57.803 CC examples/bdev/bdevperf/bdevperf.o 00:02:57.803 CC test/event/scheduler/scheduler.o 00:02:57.803 LINK iscsi_tgt 00:02:57.803 CC test/dma/test_dma/test_dma.o 00:02:57.803 CC examples/thread/thread/thread_ex.o 00:02:58.079 LINK spdk_lspci 00:02:58.079 LINK spdk_trace_record 00:02:58.079 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:58.079 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:58.358 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:58.358 LINK spdk_tgt 00:02:58.358 LINK spdk_nvme_discover 00:02:58.358 LINK lsvmd 00:02:58.358 LINK reactor_perf 00:02:58.358 LINK spdk_dd 00:02:58.358 LINK stub 00:02:58.358 LINK poller_perf 00:02:58.358 CC test/lvol/esnap/esnap.o 00:02:58.358 LINK nvmf_tgt 00:02:58.627 LINK doorbell_aers 00:02:58.627 CC test/env/mem_callbacks/mem_callbacks.o 00:02:58.627 LINK hello_sock 00:02:58.627 LINK app_repeat 00:02:58.627 LINK fused_ordering 00:02:58.627 LINK connect_stress 00:02:58.627 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:58.627 LINK pmr_persistence 00:02:58.627 CXX test/cpp_headers/nvme_spec.o 00:02:58.627 LINK reserve 00:02:58.627 CXX test/cpp_headers/nvme_zns.o 00:02:58.627 LINK led 00:02:58.627 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:58.627 CXX test/cpp_headers/nvmf_cmd.o 00:02:58.627 LINK reactor 00:02:58.627 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:58.627 CXX test/cpp_headers/nvmf.o 00:02:58.627 LINK zipf 00:02:58.627 CXX test/cpp_headers/nvmf_spec.o 00:02:58.627 CXX test/cpp_headers/nvmf_transport.o 00:02:58.627 CXX test/cpp_headers/opal.o 00:02:58.627 CXX test/cpp_headers/opal_spec.o 00:02:58.627 CXX test/cpp_headers/pci_ids.o 00:02:58.627 CXX test/cpp_headers/pipe.o 00:02:58.627 CXX test/cpp_headers/queue.o 00:02:58.627 CXX test/cpp_headers/rpc.o 00:02:58.627 CXX test/cpp_headers/reduce.o 00:02:58.627 LINK nvme_dp 00:02:58.627 CXX test/cpp_headers/scheduler.o 00:02:58.627 CXX test/cpp_headers/scsi.o 00:02:58.627 CXX test/cpp_headers/scsi_spec.o 00:02:58.627 CXX test/cpp_headers/sock.o 00:02:58.627 LINK boot_partition 00:02:58.627 CXX test/cpp_headers/stdinc.o 00:02:58.627 CXX test/cpp_headers/string.o 00:02:58.627 CXX test/cpp_headers/thread.o 00:02:58.627 LINK mkfs 00:02:58.627 LINK jsoncat 00:02:58.627 CXX test/cpp_headers/trace.o 00:02:58.627 CXX test/cpp_headers/tree.o 00:02:58.627 CXX test/cpp_headers/trace_parser.o 00:02:58.627 CXX test/cpp_headers/ublk.o 00:02:58.627 LINK aer 00:02:58.627 LINK hello_bdev 00:02:58.899 CXX test/cpp_headers/util.o 00:02:58.899 LINK histogram_perf 00:02:58.899 CXX test/cpp_headers/uuid.o 00:02:58.899 CXX test/cpp_headers/version.o 00:02:58.899 CXX test/cpp_headers/vfio_user_pci.o 00:02:58.899 CXX test/cpp_headers/vfio_user_spec.o 00:02:58.899 CXX test/cpp_headers/vmd.o 00:02:58.899 LINK startup 00:02:58.899 LINK nvmf 00:02:58.899 CXX test/cpp_headers/vhost.o 00:02:58.899 LINK vtophys 00:02:58.899 CXX test/cpp_headers/xor.o 00:02:58.899 CXX test/cpp_headers/zipf.o 00:02:58.899 LINK reconnect 00:02:58.899 LINK hello_blob 
00:02:58.899 LINK event_perf 00:02:58.899 LINK bdev_svc 00:02:58.899 LINK cmb_copy 00:02:58.899 LINK ioat_perf 00:02:58.899 LINK env_dpdk_post_init 00:02:59.164 LINK reset 00:02:59.164 LINK arbitration 00:02:59.164 LINK verify 00:02:59.164 LINK accel_perf 00:02:59.164 LINK hello_world 00:02:59.164 LINK simple_copy 00:02:59.164 LINK hotplug 00:02:59.424 LINK err_injection 00:02:59.424 LINK sgl 00:02:59.424 LINK overhead 00:02:59.424 LINK spdk_bdev 00:02:59.424 LINK spdk_nvme_perf 00:02:59.424 LINK nvme_fuzz 00:02:59.424 LINK scheduler 00:02:59.424 LINK bdevio 00:02:59.424 LINK nvme_compliance 00:02:59.424 LINK fdp 00:02:59.424 LINK thread 00:02:59.424 LINK dif 00:02:59.685 LINK idxd_perf 00:02:59.685 LINK spdk_trace 00:02:59.685 LINK pci_ut 00:02:59.685 LINK spdk_top 00:02:59.685 LINK abort 00:02:59.685 LINK blobcli 00:02:59.685 LINK test_dma 00:02:59.685 LINK nvme_manage 00:02:59.685 LINK spdk_nvme 00:02:59.945 LINK vhost_fuzz 00:02:59.945 LINK mem_callbacks 00:02:59.945 LINK memory_ut 00:02:59.945 LINK spdk_nvme_identify 00:03:00.206 LINK bdevperf 00:03:00.206 LINK cuse 00:03:00.776 LINK iscsi_fuzz 00:03:03.322 LINK esnap 00:03:03.894 00:03:03.894 real 1m18.390s 00:03:03.894 user 13m58.008s 00:03:03.894 sys 8m59.837s 00:03:03.894 09:58:25 make -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:03:03.894 09:58:25 make -- common/autotest_common.sh@10 -- $ set +x 00:03:03.894 ************************************ 00:03:03.894 END TEST make 00:03:03.894 ************************************ 00:03:03.894 09:58:25 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:03.894 09:58:25 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:03.894 09:58:25 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:03.894 09:58:25 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:03.894 09:58:25 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:03.894 09:58:25 -- pm/common@44 -- $ pid=780200 00:03:03.894 09:58:25 -- pm/common@50 -- $ kill -TERM 780200 00:03:03.894 09:58:25 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:03.894 09:58:25 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:03.894 09:58:25 -- pm/common@44 -- $ pid=780201 00:03:03.894 09:58:25 -- pm/common@50 -- $ kill -TERM 780201 00:03:03.894 09:58:25 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:03.894 09:58:25 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:03.894 09:58:25 -- pm/common@44 -- $ pid=780203 00:03:03.894 09:58:25 -- pm/common@50 -- $ kill -TERM 780203 00:03:03.894 09:58:25 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:03.894 09:58:25 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:03.894 09:58:25 -- pm/common@44 -- $ pid=780225 00:03:03.894 09:58:25 -- pm/common@50 -- $ sudo -E kill -TERM 780225 00:03:03.894 09:58:25 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:03:03.894 09:58:25 -- nvmf/common.sh@7 -- # uname -s 00:03:03.894 09:58:25 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:03.894 09:58:25 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:03.894 09:58:25 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:03.894 09:58:25 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:03.894 
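A little earlier in this block, the pm/common@42-50 trace stops the resource monitors by reading their pid files and sending TERM. A rough sketch of that pid-file teardown pattern, with hypothetical monitor names and output directory (the log above also shows the BMC collector being stopped separately with sudo -E kill -TERM, which this sketch leaves out):

#!/usr/bin/env bash
# Sketch only: stop background monitors that recorded their PIDs as <name>.pid files.
power_dir=/tmp/power-logs                      # hypothetical output directory
monitors=(collect-cpu-load collect-vmstat collect-cpu-temp)

for mon in "${monitors[@]}"; do
    pid_file="$power_dir/$mon.pid"
    [[ -e $pid_file ]] || continue             # this monitor was never started
    pid=$(<"$pid_file")
    kill -TERM "$pid" 2>/dev/null || true      # TERM first so the collector can flush its log
done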
09:58:25 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:03.894 09:58:25 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:03.894 09:58:25 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:03.894 09:58:25 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:03.894 09:58:25 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:03.894 09:58:25 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:03.894 09:58:25 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:03:03.894 09:58:25 -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:03:03.894 09:58:25 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:03.894 09:58:25 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:03.895 09:58:25 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:03.895 09:58:25 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:03.895 09:58:25 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:03:03.895 09:58:25 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:03.895 09:58:25 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:03.895 09:58:25 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:03.895 09:58:25 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:03.895 09:58:25 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:03.895 09:58:25 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:03.895 09:58:25 -- paths/export.sh@5 -- # export PATH 00:03:03.895 09:58:25 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:03.895 09:58:25 -- nvmf/common.sh@47 -- # : 0 00:03:03.895 09:58:25 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:03:03.895 09:58:25 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:03:03.895 09:58:25 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:03.895 09:58:25 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:03.895 09:58:25 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:03.895 09:58:25 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:03:03.895 09:58:25 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:03:03.895 09:58:25 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:03:03.895 09:58:25 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:03.895 09:58:25 -- spdk/autotest.sh@32 -- # uname -s 00:03:03.895 09:58:25 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:03.895 09:58:25 -- 
spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:03.895 09:58:25 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:03.895 09:58:25 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:03.895 09:58:25 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:03.895 09:58:25 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:03.895 09:58:25 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:03.895 09:58:25 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:03.895 09:58:25 -- spdk/autotest.sh@48 -- # udevadm_pid=848606 00:03:03.895 09:58:25 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:03.895 09:58:25 -- pm/common@17 -- # local monitor 00:03:03.895 09:58:25 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:03.895 09:58:25 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:03.895 09:58:25 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:03.895 09:58:25 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:03.895 09:58:25 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:03.895 09:58:25 -- pm/common@21 -- # date +%s 00:03:03.895 09:58:25 -- pm/common@25 -- # sleep 1 00:03:03.895 09:58:25 -- pm/common@21 -- # date +%s 00:03:03.895 09:58:25 -- pm/common@21 -- # date +%s 00:03:03.895 09:58:25 -- pm/common@21 -- # date +%s 00:03:03.895 09:58:25 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718006305 00:03:03.895 09:58:25 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718006305 00:03:03.895 09:58:25 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718006305 00:03:03.895 09:58:25 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718006305 00:03:04.156 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718006305_collect-vmstat.pm.log 00:03:04.156 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718006305_collect-cpu-load.pm.log 00:03:04.156 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718006305_collect-cpu-temp.pm.log 00:03:04.156 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718006305_collect-bmc-pm.bmc.pm.log 00:03:05.205 09:58:26 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:05.205 09:58:26 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:05.205 09:58:26 -- common/autotest_common.sh@723 -- # xtrace_disable 00:03:05.205 09:58:26 -- common/autotest_common.sh@10 -- # set +x 00:03:05.205 09:58:26 -- spdk/autotest.sh@59 -- # create_test_list 00:03:05.205 09:58:26 -- common/autotest_common.sh@747 -- # xtrace_disable 
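The spdk/autotest.sh@33-40 entries above save the old kernel core_pattern and point it at SPDK's core-collector.sh so coredumps land in the output directory. A hedged sketch of that save-and-replace pattern, with hypothetical collector and output paths (the saved old_core_pattern is presumably written back once the run finishes, which is not visible in this part of the log):

#!/usr/bin/env bash
# Sketch only: pipe kernel coredumps through a collector script.
collector=/opt/ci/core-collector.sh            # hypothetical collector path
out_dir=/opt/ci/output/coredumps               # hypothetical coredump directory

mkdir -p "$out_dir"
old_core_pattern=$(</proc/sys/kernel/core_pattern)   # saved for a later restore

# A leading '|' makes the kernel pipe each dump into the collector;
# %P %s %t forward the crashing PID, the signal number and a timestamp.
echo "|$collector %P %s %t" | sudo tee /proc/sys/kernel/core_pattern >/dev/null

# ... run the tests ...

echo "$old_core_pattern" | sudo tee /proc/sys/kernel/core_pattern >/dev/null   # restore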
00:03:05.205 09:58:26 -- common/autotest_common.sh@10 -- # set +x 00:03:05.205 09:58:26 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:03:05.205 09:58:26 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:05.205 09:58:26 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:05.205 09:58:26 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:03:05.205 09:58:26 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:05.205 09:58:26 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:05.205 09:58:26 -- common/autotest_common.sh@1454 -- # uname 00:03:05.205 09:58:26 -- common/autotest_common.sh@1454 -- # '[' Linux = FreeBSD ']' 00:03:05.205 09:58:26 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:05.205 09:58:26 -- common/autotest_common.sh@1474 -- # uname 00:03:05.205 09:58:26 -- common/autotest_common.sh@1474 -- # [[ Linux = FreeBSD ]] 00:03:05.205 09:58:26 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:03:05.205 09:58:26 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:03:05.205 09:58:26 -- spdk/autotest.sh@72 -- # hash lcov 00:03:05.205 09:58:26 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:03:05.205 09:58:26 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:03:05.205 --rc lcov_branch_coverage=1 00:03:05.205 --rc lcov_function_coverage=1 00:03:05.205 --rc genhtml_branch_coverage=1 00:03:05.205 --rc genhtml_function_coverage=1 00:03:05.205 --rc genhtml_legend=1 00:03:05.205 --rc geninfo_all_blocks=1 00:03:05.205 ' 00:03:05.205 09:58:26 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:03:05.205 --rc lcov_branch_coverage=1 00:03:05.205 --rc lcov_function_coverage=1 00:03:05.205 --rc genhtml_branch_coverage=1 00:03:05.205 --rc genhtml_function_coverage=1 00:03:05.205 --rc genhtml_legend=1 00:03:05.205 --rc geninfo_all_blocks=1 00:03:05.205 ' 00:03:05.205 09:58:26 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:03:05.205 --rc lcov_branch_coverage=1 00:03:05.205 --rc lcov_function_coverage=1 00:03:05.205 --rc genhtml_branch_coverage=1 00:03:05.205 --rc genhtml_function_coverage=1 00:03:05.205 --rc genhtml_legend=1 00:03:05.205 --rc geninfo_all_blocks=1 00:03:05.205 --no-external' 00:03:05.205 09:58:26 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:03:05.205 --rc lcov_branch_coverage=1 00:03:05.205 --rc lcov_function_coverage=1 00:03:05.205 --rc genhtml_branch_coverage=1 00:03:05.205 --rc genhtml_function_coverage=1 00:03:05.205 --rc genhtml_legend=1 00:03:05.205 --rc geninfo_all_blocks=1 00:03:05.205 --no-external' 00:03:05.205 09:58:26 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:03:05.205 lcov: LCOV version 1.14 00:03:05.205 09:58:26 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:17.430 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:17.430 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:32.336 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:32.336 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:32.337 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:32.337 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:32.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:32.599 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:32.862 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:32.863 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:32.863 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:32.863 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:32.863 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:34.776 09:58:56 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:34.776 09:58:56 -- common/autotest_common.sh@723 -- # xtrace_disable 00:03:34.776 09:58:56 -- common/autotest_common.sh@10 -- # set +x 00:03:34.776 09:58:56 -- spdk/autotest.sh@91 -- # rm -f 00:03:34.776 09:58:56 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:38.979 0000:80:01.6 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:80:01.7 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:80:01.4 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:80:01.5 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:80:01.2 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:80:01.3 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:80:01.0 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:80:01.1 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:65:00.0 (8086 0a54): Already using the nvme driver 00:03:38.979 0000:00:01.6 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:00:01.7 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:00:01.4 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:00:01.5 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:00:01.2 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:00:01.3 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:00:01.0 (8086 0b00): Already using the ioatdma driver 00:03:38.979 0000:00:01.1 (8086 0b00): Already using the ioatdma driver 00:03:38.979 09:59:00 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:38.979 09:59:00 -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:03:38.979 09:59:00 -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:03:38.979 09:59:00 -- common/autotest_common.sh@1669 -- # local nvme bdf 00:03:38.979 09:59:00 -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:03:38.979 09:59:00 -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:03:38.979 09:59:00 -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:03:38.979 09:59:00 -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:38.979 09:59:00 -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:03:38.979 09:59:00 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:38.979 09:59:00 -- spdk/autotest.sh@110 -- # for dev in 
/dev/nvme*n!(*p*) 00:03:38.979 09:59:00 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:38.979 09:59:00 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:38.979 09:59:00 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:38.979 09:59:00 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:38.979 No valid GPT data, bailing 00:03:38.979 09:59:00 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:38.979 09:59:00 -- scripts/common.sh@391 -- # pt= 00:03:38.979 09:59:00 -- scripts/common.sh@392 -- # return 1 00:03:38.979 09:59:00 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:38.979 1+0 records in 00:03:38.979 1+0 records out 00:03:38.979 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00606087 s, 173 MB/s 00:03:38.979 09:59:00 -- spdk/autotest.sh@118 -- # sync 00:03:38.979 09:59:00 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:38.979 09:59:00 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:38.979 09:59:00 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:47.118 09:59:08 -- spdk/autotest.sh@124 -- # uname -s 00:03:47.118 09:59:08 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:47.118 09:59:08 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:47.118 09:59:08 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:47.118 09:59:08 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:47.118 09:59:08 -- common/autotest_common.sh@10 -- # set +x 00:03:47.118 ************************************ 00:03:47.118 START TEST setup.sh 00:03:47.118 ************************************ 00:03:47.118 09:59:08 setup.sh -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:47.118 * Looking for test storage... 00:03:47.118 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:47.118 09:59:08 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:47.118 09:59:08 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:47.118 09:59:08 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:47.118 09:59:08 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:47.118 09:59:08 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:47.118 09:59:08 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:47.118 ************************************ 00:03:47.118 START TEST acl 00:03:47.118 ************************************ 00:03:47.118 09:59:08 setup.sh.acl -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:47.118 * Looking for test storage... 
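The spdk/autotest.sh@96-118 trace above walks the whole-namespace NVMe devices, skips zoned ones, checks for an existing partition table, and only then zeroes the first MiB. A rough sketch of that guard with the in-use check reduced to blkid alone (the real harness also consults scripts/spdk-gpt.py before deciding a disk is safe to touch):

#!/usr/bin/env bash
# Sketch only: wipe the start of idle, non-zoned NVMe namespaces before tests.
shopt -s extglob nullglob

for dev in /dev/nvme*n!(*p*); do               # whole namespaces, not partitions
    name=$(basename "$dev")

    # Zoned namespaces need a different I/O pattern, so leave them alone.
    zoned_attr=/sys/block/$name/queue/zoned
    if [[ -e $zoned_attr && $(<"$zoned_attr") != none ]]; then
        echo "skipping zoned device $dev"
        continue
    fi

    # A reported partition-table type means the disk is probably in use.
    if pt=$(blkid -s PTTYPE -o value "$dev") && [[ -n $pt ]]; then
        echo "skipping $dev: found $pt partition table"
        continue
    fi

    dd if=/dev/zero of="$dev" bs=1M count=1    # clear stale metadata in the first MiB
done
sync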
00:03:47.118 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:47.119 09:59:08 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:47.119 09:59:08 setup.sh.acl -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:03:47.119 09:59:08 setup.sh.acl -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:03:47.119 09:59:08 setup.sh.acl -- common/autotest_common.sh@1669 -- # local nvme bdf 00:03:47.119 09:59:08 setup.sh.acl -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:03:47.119 09:59:08 setup.sh.acl -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:03:47.119 09:59:08 setup.sh.acl -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:03:47.119 09:59:08 setup.sh.acl -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:47.119 09:59:08 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:03:47.119 09:59:08 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:47.119 09:59:08 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:47.119 09:59:08 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:47.119 09:59:08 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:47.119 09:59:08 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:47.119 09:59:08 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:47.119 09:59:08 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:51.333 09:59:12 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:51.333 09:59:12 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:51.333 09:59:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.333 09:59:12 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:51.333 09:59:12 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:51.333 09:59:12 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:55.543 Hugepages 00:03:55.543 node hugesize free / total 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 00:03:55.543 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.0 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 
0000:00:01.1 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.2 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.3 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.4 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.5 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.6 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.7 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:65:00.0 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.0 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.1 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.2 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 
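The setup/acl.sh@18-22 loop traced here reads the "Type BDF Vendor Device NUMA Driver ..." table, skips the ioatdma channels, and keeps any controller already bound to the nvme driver. A small sketch of that classification pattern, reading a hypothetical status.txt instead of live setup.sh output (the real acl.sh additionally honours PCI_BLOCKED, which this sketch omits):

#!/usr/bin/env bash
# Sketch only: collect NVMe controllers from a "Type BDF Vendor Device NUMA Driver ..." table.
declare -a devs
declare -A drivers

while read -r _ dev _ _ _ driver _; do
    [[ $dev == *:*:*.* ]] || continue          # skip headers and hugepage lines
    [[ $driver == nvme ]]  || continue         # only controllers bound to the nvme driver matter here
    devs+=("$dev")
    drivers["$dev"]=$driver
done < status.txt                              # stand-in for "setup.sh status" output

printf 'found %d nvme controller(s): %s\n' "${#devs[@]}" "${devs[*]}"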
00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.3 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.4 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.5 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.6 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.7 == *:*:*.* ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:55.543 09:59:16 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:55.543 09:59:16 setup.sh.acl -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:55.543 09:59:16 setup.sh.acl -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:55.543 09:59:16 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:55.543 ************************************ 00:03:55.543 START TEST denied 00:03:55.543 ************************************ 00:03:55.543 09:59:16 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # denied 00:03:55.543 09:59:16 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:65:00.0' 00:03:55.543 09:59:16 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:55.543 09:59:16 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:65:00.0' 00:03:55.543 09:59:16 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:55.543 09:59:16 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:59.754 0000:65:00.0 (8086 0a54): Skipping denied controller at 0000:65:00.0 00:03:59.754 09:59:20 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:65:00.0 00:03:59.754 09:59:20 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:59.754 09:59:20 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:59.754 09:59:20 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:65:00.0 ]] 00:03:59.754 09:59:20 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:65:00.0/driver 00:03:59.754 09:59:20 setup.sh.acl.denied -- 
setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:59.754 09:59:20 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:59.754 09:59:20 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:03:59.754 09:59:20 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:59.754 09:59:20 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:05.088 00:04:05.088 real 0m9.281s 00:04:05.088 user 0m3.086s 00:04:05.088 sys 0m5.456s 00:04:05.088 09:59:26 setup.sh.acl.denied -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:05.088 09:59:26 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:05.088 ************************************ 00:04:05.088 END TEST denied 00:04:05.088 ************************************ 00:04:05.088 09:59:26 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:05.088 09:59:26 setup.sh.acl -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:05.088 09:59:26 setup.sh.acl -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:05.088 09:59:26 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:05.088 ************************************ 00:04:05.088 START TEST allowed 00:04:05.088 ************************************ 00:04:05.088 09:59:26 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # allowed 00:04:05.088 09:59:26 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:65:00.0 00:04:05.088 09:59:26 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:05.088 09:59:26 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:65:00.0 .*: nvme -> .*' 00:04:05.088 09:59:26 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:05.088 09:59:26 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:10.376 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:04:10.376 09:59:32 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:10.376 09:59:32 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:10.376 09:59:32 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:10.376 09:59:32 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:10.376 09:59:32 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:15.661 00:04:15.661 real 0m10.317s 00:04:15.661 user 0m3.125s 00:04:15.661 sys 0m5.407s 00:04:15.661 09:59:36 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:15.661 09:59:36 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:15.661 ************************************ 00:04:15.661 END TEST allowed 00:04:15.661 ************************************ 00:04:15.661 00:04:15.661 real 0m28.227s 00:04:15.661 user 0m9.340s 00:04:15.661 sys 0m16.596s 00:04:15.661 09:59:36 setup.sh.acl -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:15.662 09:59:36 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:15.662 ************************************ 00:04:15.662 END TEST acl 00:04:15.662 ************************************ 00:04:15.662 09:59:36 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:15.662 09:59:36 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:15.662 09:59:36 setup.sh -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:04:15.662 09:59:36 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:15.662 ************************************ 00:04:15.662 START TEST hugepages 00:04:15.662 ************************************ 00:04:15.662 09:59:36 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:15.662 * Looking for test storage... 00:04:15.662 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 103304224 kB' 'MemAvailable: 106564632 kB' 'Buffers: 3736 kB' 'Cached: 14519464 kB' 'SwapCached: 0 kB' 'Active: 11546612 kB' 'Inactive: 3520652 kB' 'Active(anon): 11127668 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 547504 kB' 'Mapped: 177476 kB' 'Shmem: 10583604 kB' 'KReclaimable: 290312 kB' 'Slab: 1025768 kB' 'SReclaimable: 290312 kB' 'SUnreclaim: 735456 kB' 'KernelStack: 25136 kB' 'PageTables: 8852 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69463464 kB' 'Committed_AS: 12657768 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230092 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e 
]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 
setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- 
setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.662 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 
00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 
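[editor's note] The long run of continue lines above is common.sh's get_meminfo scanning /proc/meminfo field by field until it reaches the Hugepagesize key (2048 kB on this node). Outside the test harness the same value can be read directly; a small sketch, assuming only the standard /proc/meminfo layout:

    # Read the default hugepage size (kB) straight from /proc/meminfo.
    hugepagesize_kb=$(awk '$1 == "Hugepagesize:" {print $2}' /proc/meminfo)
    echo "default hugepage size: ${hugepagesize_kb} kB"   # 2048 here
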
00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:15.663 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:15.664 09:59:36 
setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:15.664 09:59:36 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:15.664 09:59:36 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:15.664 09:59:36 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:15.664 09:59:36 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:15.664 ************************************ 00:04:15.664 START TEST default_setup 00:04:15.664 ************************************ 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # default_setup 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:15.664 09:59:36 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:19.046 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 
00:04:19.046 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:04:19.046 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:04:20.981 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105472468 kB' 'MemAvailable: 108732828 kB' 'Buffers: 3736 kB' 'Cached: 14519608 kB' 'SwapCached: 0 kB' 'Active: 11565540 kB' 'Inactive: 3520652 kB' 'Active(anon): 11146596 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 565888 kB' 'Mapped: 177332 kB' 'Shmem: 10583748 kB' 'KReclaimable: 290216 kB' 'Slab: 1023484 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733268 kB' 'KernelStack: 24976 kB' 'PageTables: 8668 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12680816 kB' 'VmallocTotal: 13743895347199 kB' 
'VmallocUsed: 230092 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
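[editor's note] By this point the default_setup test has already translated its 2097152 kB request into a per-node page count: with a 2048 kB hugepage size that is 2097152 / 2048 = 1024 pages, matching the HugePages_Total: 1024 shown in the meminfo dump above, and verify_nr_hugepages is re-reading /proc/meminfo to confirm it. A sketch of the same arithmetic and the sysfs write it implies (the node path and variable names are assumptions, not the harness's exact code, and the real run drives this through scripts/setup.sh):

    # Illustrative only: convert a memory budget in kB into 2 MiB hugepages
    # and request them on NUMA node 0 (requires root).
    HUGEMEM_KB=2097152
    hugepagesize_kb=$(awk '$1 == "Hugepagesize:" {print $2}' /proc/meminfo)
    nr_hugepages=$(( HUGEMEM_KB / hugepagesize_kb ))   # 2097152 / 2048 = 1024
    echo "$nr_hugepages" > /sys/devices/system/node/node0/hugepages/hugepages-${hugepagesize_kb}kB/nr_hugepages
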
00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.981 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.982 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 
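[editor's note] The trace is now inside verify_nr_hugepages: anon has been set to 0 and get_meminfo is being called again, this time for HugePages_Surp, scanning the same meminfo stream key by key. The per-key scan exists because the helper can also target a single NUMA node's meminfo file; a rough stand-in with the same shape (function and variable names are assumptions, not the real common.sh):

    # Rough stand-in for a get_meminfo-style helper: fetch one key either from
    # the global /proc/meminfo or from a specific node's meminfo file.
    get_meminfo_sketch() {
        local key=$1 node=${2-}
        local src=/proc/meminfo
        [[ -n $node ]] && src=/sys/devices/system/node/node${node}/meminfo
        # Node meminfo lines carry a "Node N" prefix; strip it so keys line up.
        sed 's/^Node [0-9]* //' "$src" | awk -v k="${key}:" '$1 == k {print $2}'
    }
    get_meminfo_sketch HugePages_Surp      # e.g. 0
    get_meminfo_sketch HugePages_Total 0   # per-node count on node0
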
00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105471648 kB' 'MemAvailable: 108732008 kB' 'Buffers: 3736 kB' 'Cached: 14519608 kB' 'SwapCached: 0 kB' 'Active: 11565180 kB' 'Inactive: 3520652 kB' 'Active(anon): 11146236 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 565920 kB' 'Mapped: 177408 kB' 'Shmem: 10583748 kB' 'KReclaimable: 290216 kB' 'Slab: 1023504 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733288 kB' 'KernelStack: 24944 kB' 'PageTables: 8600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12680832 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230060 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.983 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.984 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105472152 kB' 'MemAvailable: 108732512 kB' 'Buffers: 3736 kB' 'Cached: 14519608 kB' 'SwapCached: 0 kB' 'Active: 11565180 kB' 'Inactive: 3520652 kB' 'Active(anon): 11146236 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 565920 kB' 'Mapped: 177408 kB' 'Shmem: 10583748 kB' 'KReclaimable: 290216 kB' 'Slab: 1023504 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733288 kB' 'KernelStack: 24944 kB' 'PageTables: 8600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12680856 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230060 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 
09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.985 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 
-- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.986 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:20.987 nr_hugepages=1024 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:20.987 resv_hugepages=0 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:20.987 surplus_hugepages=0 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:20.987 anon_hugepages=0 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105471640 kB' 'MemAvailable: 108732000 kB' 'Buffers: 3736 kB' 'Cached: 14519612 kB' 'SwapCached: 0 kB' 'Active: 
11565320 kB' 'Inactive: 3520652 kB' 'Active(anon): 11146376 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566056 kB' 'Mapped: 177408 kB' 'Shmem: 10583752 kB' 'KReclaimable: 290216 kB' 'Slab: 1023504 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733288 kB' 'KernelStack: 24928 kB' 'PageTables: 8544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12680876 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230076 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 
09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.987 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:20.988 
09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:20.988 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 58095244 kB' 'MemUsed: 7566756 kB' 'SwapCached: 0 kB' 'Active: 3913924 kB' 'Inactive: 152040 kB' 'Active(anon): 3812412 kB' 'Inactive(anon): 0 kB' 'Active(file): 101512 kB' 'Inactive(file): 152040 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3735752 kB' 'Mapped: 38424 kB' 'AnonPages: 333540 kB' 'Shmem: 3482200 kB' 'KernelStack: 12424 kB' 'PageTables: 5024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 108380 kB' 'Slab: 437464 kB' 'SReclaimable: 108380 kB' 'SUnreclaim: 329084 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:20.989 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup 
-- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.989 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:21.252 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:21.253 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:21.253 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:21.253 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:21.253 node0=1024 expecting 1024 00:04:21.253 09:59:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:21.253 00:04:21.253 real 0m6.063s 00:04:21.253 user 0m1.610s 00:04:21.253 sys 0m2.661s 00:04:21.253 09:59:42 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:21.253 09:59:42 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:21.253 ************************************ 00:04:21.253 END TEST default_setup 00:04:21.253 ************************************ 00:04:21.253 09:59:42 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:21.253 09:59:42 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:21.253 09:59:42 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:21.253 09:59:42 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:21.253 ************************************ 00:04:21.253 START TEST per_node_1G_alloc 00:04:21.253 ************************************ 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # per_node_1G_alloc 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 
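Just above, default_setup finishes its verification (node0=1024 expecting 1024) and run_test launches per_node_1G_alloc, whose first step, get_test_nr_hugepages 1048576 0 1, converts a 1 GiB request into 512 default-sized hugepages and books that count on both NUMA nodes before NRHUGE=512 and HUGENODE=0,1 are handed to scripts/setup.sh. Below is a small sketch of that arithmetic; the 2048 kB default hugepage size and the node list are illustrative values taken from this trace, not probed from the live system.

#!/usr/bin/env bash
# Sketch of the sizing step above: turn a 1 GiB request into a count of
# default-sized hugepages and book that count on every requested NUMA node.
size_kb=1048576                        # requested allocation: 1 GiB, in kB
default_hugepage_kb=2048               # typical x86_64 default hugepage size (assumed)
node_ids=(0 1)                         # nodes named on the command line

nr_hugepages=$(( size_kb / default_hugepage_kb ))   # 1048576 / 2048 = 512

nodes_test=()
for node in "${node_ids[@]}"; do
    nodes_test[node]=$nr_hugepages     # each listed node gets the full count
done

# These are the knobs later handed to scripts/setup.sh in the trace.
echo "NRHUGE=$nr_hugepages HUGENODE=$(IFS=,; echo "${node_ids[*]}")"
for node in "${!nodes_test[@]}"; do
    echo "node$node => ${nodes_test[node]} hugepages"
done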
00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:21.253 09:59:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:25.469 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:25.469 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105464588 kB' 'MemAvailable: 108724948 kB' 'Buffers: 3736 kB' 'Cached: 14519764 kB' 'SwapCached: 0 kB' 'Active: 11563948 kB' 'Inactive: 3520652 kB' 'Active(anon): 11145004 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564220 kB' 'Mapped: 176612 kB' 'Shmem: 10583904 kB' 'KReclaimable: 290216 kB' 'Slab: 1023556 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733340 kB' 'KernelStack: 25072 kB' 'PageTables: 8776 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12666292 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230332 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 
109051904 kB' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
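The block above is the anonymous-hugepage probe at the start of the verification: the trace tests /sys/kernel/mm/transparent_hugepage/enabled (shown as "always [madvise] never") and, because THP is not pinned to [never], goes on to read AnonHugePages from meminfo. A minimal sketch of that check follows; the awk one-liner stands in for the script's own get_meminfo loop and the variable names are illustrative.

#!/usr/bin/env bash
# Sketch of the anon-hugepage probe seen above: only sample AnonHugePages when
# transparent hugepages are not globally disabled ("[never]").
thp_mode=$(< /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"

if [[ $thp_mode != *"[never]"* ]]; then
    # Pull the AnonHugePages value (kB) straight from /proc/meminfo.
    anon=$(awk '$1 == "AnonHugePages:" { print $2 }' /proc/meminfo)
else
    anon=0
fi
echo "AnonHugePages: ${anon:-0} kB"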
00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.469 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 
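From here the trace gathers HugePages_Surp (and, further on, the reserved count) so it can rerun the consistency check seen earlier at hugepages.sh@110: HugePages_Total must equal the configured pool plus surplus and reserved pages. A compact sketch of that bookkeeping, assuming the 1024-page pool from this run; read_field is an illustrative helper, not part of the test suite.

#!/usr/bin/env bash
# Sketch of the consistency check the verify step performs: the kernel's
# HugePages_Total must equal the requested pool plus surplus and reserved pages.
expected=1024     # mirrors the nr_hugepages value configured in this run

read_field() { awk -v f="$1:" '$1 == f { print $2 }' /proc/meminfo; }

total=$(read_field HugePages_Total)
surp=$(read_field HugePages_Surp)
resv=$(read_field HugePages_Rsvd)

if (( total == expected + surp + resv )); then
    echo "hugepage pool consistent: $total == $expected + $surp + $resv"
else
    echo "hugepage pool mismatch: total=$total surp=$surp resv=$resv expected=$expected" >&2
    exit 1
fi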
00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105465084 kB' 'MemAvailable: 108725444 kB' 'Buffers: 3736 kB' 'Cached: 14519768 kB' 'SwapCached: 0 kB' 'Active: 11563448 kB' 'Inactive: 3520652 kB' 'Active(anon): 11144504 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563804 kB' 'Mapped: 176584 kB' 'Shmem: 10583908 kB' 'KReclaimable: 290216 kB' 'Slab: 1023536 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733320 kB' 'KernelStack: 24944 kB' 'PageTables: 8448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12666308 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230236 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.470 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:25.471 09:59:46 
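The trace above is setup/common.sh's get_meminfo scanning every /proc/meminfo key until it reaches HugePages_Surp, echoing its value (0) and returning, which hugepages.sh then records as surp=0. A minimal bash sketch of that lookup follows; meminfo_value is a hypothetical stand-in written from what the trace shows, not the real get_meminfo from setup/common.sh.

#!/usr/bin/env bash
# Hypothetical lookup of a single key in /proc/meminfo or a per-node meminfo,
# modelled on the IFS=': ' / read loop visible in the trace above.
meminfo_value() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # Per-node files prefix every line with "Node <N> "; strip that first,
    # then split on ':' and whitespace exactly as the traced loop does.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"      # numeric value only; the "kB" unit falls into $_
            return 0
        fi
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    return 1
}

# Example matching this run:
surp=$(meminfo_value HugePages_Surp)   # -> 0 on this machine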
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105464320 kB' 'MemAvailable: 108724680 kB' 'Buffers: 3736 kB' 'Cached: 14519780 kB' 'SwapCached: 0 kB' 'Active: 11564432 kB' 'Inactive: 3520652 kB' 'Active(anon): 11145488 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564832 kB' 'Mapped: 176504 kB' 'Shmem: 10583920 kB' 'KReclaimable: 290216 kB' 'Slab: 1023492 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733276 kB' 'KernelStack: 24976 kB' 'PageTables: 8532 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12666332 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230316 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.471 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:25.472 nr_hugepages=1024 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:25.472 resv_hugepages=0 00:04:25.472 09:59:46 
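At this point hugepages.sh has surp=0 and resv=0 and reports nr_hugepages=1024 / resv_hugepages=0 before re-reading HugePages_Total. A condensed, hypothetical sketch of that bookkeeping, reusing the meminfo_value stand-in above (the real script's variable names and exact checks may differ):

# Hypothetical condensed form of the accounting seen in the trace.
nr_hugepages=1024
surp=$(meminfo_value HugePages_Surp)    # 0 in this run
resv=$(meminfo_value HugePages_Rsvd)    # 0 in this run
total=$(meminfo_value HugePages_Total)  # 1024 in this run
echo "nr_hugepages=$nr_hugepages"
echo "resv_hugepages=$resv"
echo "surplus_hugepages=$surp"
# Every configured page must be accounted for before the per-node split.
(( total == nr_hugepages + surp + resv )) || exit 1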
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:25.472 surplus_hugepages=0 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:25.472 anon_hugepages=0 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105466628 kB' 'MemAvailable: 108726988 kB' 'Buffers: 3736 kB' 'Cached: 14519808 kB' 'SwapCached: 0 kB' 'Active: 11563528 kB' 'Inactive: 3520652 kB' 'Active(anon): 11144584 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563856 kB' 'Mapped: 176504 kB' 'Shmem: 10583948 kB' 'KReclaimable: 290216 kB' 'Slab: 1023492 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733276 kB' 'KernelStack: 24944 kB' 'PageTables: 8584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12667960 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230380 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:25.472 09:59:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 59137612 kB' 'MemUsed: 6524388 kB' 'SwapCached: 0 kB' 'Active: 3914108 kB' 'Inactive: 152040 kB' 'Active(anon): 3812596 kB' 'Inactive(anon): 0 kB' 'Active(file): 101512 kB' 'Inactive(file): 152040 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3735868 kB' 'Mapped: 38056 kB' 'AnonPages: 333424 kB' 'Shmem: 3482316 kB' 'KernelStack: 12360 kB' 'PageTables: 4696 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 108380 kB' 'Slab: 437284 kB' 'SReclaimable: 108380 kB' 'SUnreclaim: 328904 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.472 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 
-- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682028 kB' 'MemFree: 46331544 kB' 'MemUsed: 14350484 kB' 'SwapCached: 0 kB' 'Active: 7649652 kB' 'Inactive: 3368612 kB' 'Active(anon): 7332220 kB' 'Inactive(anon): 0 kB' 'Active(file): 317432 kB' 'Inactive(file): 3368612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10787696 kB' 'Mapped: 138448 kB' 'AnonPages: 230656 kB' 'Shmem: 7101652 kB' 'KernelStack: 12616 kB' 'PageTables: 3840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 181836 kB' 'Slab: 586208 kB' 'SReclaimable: 181836 kB' 'SUnreclaim: 404372 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 
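The per-node lookups traced here (node 0 just above, node 1 in the lines that follow) all go through the same helper in setup/common.sh: pick /proc/meminfo or the per-node /sys/devices/system/node/nodeN/meminfo file, strip the "Node N" prefix that the per-node files add, then walk the "key: value" pairs until the requested field (HugePages_Surp in this case) matches and echo its value. A minimal standalone sketch of that pattern, under the assumption of a simplified helper name (get_meminfo_value below is illustrative, not the verbatim SPDK function):

#!/usr/bin/env bash
# Simplified illustration of the meminfo field lookup exercised in the trace
# above; the real helper is get_meminfo in setup/common.sh.
get_meminfo_value() {
    local get=$1 node=${2-}
    local mem_f=/proc/meminfo var val _

    # Per-node statistics live under sysfs when a node index is given.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    # Per-node files prefix every line with "Node N "; drop that prefix so the
    # "key: value" split below works the same for both file layouts.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    return 1
}

# Example: surplus 2 MiB hugepages on NUMA node 0, as queried in the trace.
get_meminfo_value HugePages_Surp 0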
00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.473 09:59:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.473 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.474 09:59:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:25.474 node0=512 expecting 512 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:25.474 node1=512 expecting 512 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:25.474 00:04:25.474 real 0m4.130s 00:04:25.474 user 0m1.541s 00:04:25.474 sys 0m2.663s 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:25.474 09:59:47 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:25.474 ************************************ 00:04:25.474 END TEST per_node_1G_alloc 00:04:25.474 ************************************ 00:04:25.474 09:59:47 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:25.474 09:59:47 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:25.474 09:59:47 
setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:25.474 09:59:47 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:25.474 ************************************ 00:04:25.474 START TEST even_2G_alloc 00:04:25.474 ************************************ 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # even_2G_alloc 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:25.474 09:59:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:29.684 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 
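Between the START banner and the device listing, the trace sets up the even_2G_alloc case: 2097152 kB (2 GiB) of hugepages becomes nr_hugepages=1024 two-megabyte pages, split evenly across the two NUMA nodes (512 each), and scripts/setup.sh is invoked with NRHUGE=1024 and HUGE_EVEN_ALLOC=yes to perform the allocation. A rough sketch of the end state that configuration asks for, written directly against the per-node sysfs knobs (illustrative only; the test itself delegates this to scripts/setup.sh):

#!/usr/bin/env bash
# Request NRHUGE 2 MiB hugepages split evenly across all online NUMA nodes.
# Illustrative sketch of what HUGE_EVEN_ALLOC=yes asks scripts/setup.sh to
# arrange; the kernel may grant fewer pages if memory is fragmented.
set -e

total=${NRHUGE:-1024}
nodes=(/sys/devices/system/node/node[0-9]*)
per_node=$((total / ${#nodes[@]}))

for node in "${nodes[@]}"; do
    echo "$per_node" | sudo tee \
        "$node/hugepages/hugepages-2048kB/nr_hugepages" > /dev/null
done

# Show the resulting global pool.
grep -E 'HugePages_(Total|Free)' /proc/meminfo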
00:04:29.684 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:29.684 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105454476 kB' 'MemAvailable: 108714836 kB' 'Buffers: 3736 kB' 'Cached: 14519964 kB' 'SwapCached: 0 kB' 'Active: 11565732 kB' 'Inactive: 3520652 kB' 'Active(anon): 11146788 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 
'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 565552 kB' 'Mapped: 176632 kB' 'Shmem: 10584104 kB' 'KReclaimable: 290216 kB' 'Slab: 1023668 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733452 kB' 'KernelStack: 24944 kB' 'PageTables: 8424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12666024 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230284 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.684 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.685 09:59:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: the IFS=': ' / read -r var val _ / continue triplet repeats for every non-matching /proc/meminfo key from SwapFree through HardwareCorrupted]
00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0
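
The xtrace above is setup/common.sh's get_meminfo helper scanning every /proc/meminfo line with an IFS=': ' read until it reaches the requested key (AnonHugePages, value 0 on this node). A minimal stand-alone sketch of that lookup pattern, using an illustrative function name of my own (get_meminfo_value) and leaving out the per-NUMA-node branch the trace also exercises:

get_meminfo_value() {
    # IFS=': ' splits "AnonHugePages:       0 kB" into key, number and unit
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < /proc/meminfo
    return 1
}

get_meminfo_value AnonHugePages   # prints 0 here, matching the anon=0 assignment above
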
00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:29.685 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:29.686 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:29.686 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:29.686 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:29.686 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:29.686 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:29.686 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:29.686 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:29.686 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105456012 kB' 'MemAvailable: 108716372 kB' 'Buffers: 3736 kB' 'Cached: 14519964 kB' 'SwapCached: 0 kB' 'Active: 11566084 kB' 'Inactive: 3520652 kB' 'Active(anon): 11147140 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 565912 kB' 'Mapped: 176632 kB' 'Shmem: 10584104 kB' 'KReclaimable: 290216 kB' 'Slab: 1023660 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733444 kB' 'KernelStack: 24928 kB' 'PageTables: 8364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12666040 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230236 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB'
00:04:29.686 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: the IFS=': ' / read -r var val _ / continue triplet repeats for every non-matching key from MemTotal through HugePages_Rsvd]
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0
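
Each lookup above also tests [[ -e /sys/devices/system/node/node/meminfo ]] (the node id is empty in this run, so it falls back to /proc/meminfo) and strips a 'Node N ' prefix from the buffered lines, which is what lets the same scan serve per-NUMA-node meminfo files. A rough, hypothetical node-aware variant of the earlier sketch (the function name is mine, and a sed call stands in for the extglob prefix strip seen in the trace):

node_meminfo_value() {
    local get=$1 node=${2-} mem_f=/proc/meminfo var val _
    # Prefer the per-NUMA-node file only when a node id was given and sysfs exposes it
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # Per-node files prefix every line with "Node <id> "; drop it so the key is field one
    sed -E 's/^Node [0-9]+ //' "$mem_f" | while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; break; }
    done
}

node_meminfo_value HugePages_Surp 0   # per-node surplus huge pages, 0 in this run
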
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:29.687 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:29.688 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105456356 kB' 'MemAvailable: 108716716 kB' 'Buffers: 3736 kB' 'Cached: 14519968 kB' 'SwapCached: 0 kB' 'Active: 11564544 kB' 'Inactive: 3520652 kB' 'Active(anon): 11145600 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564840 kB' 'Mapped: 176532 kB' 'Shmem: 10584108 kB' 'KReclaimable: 290216 kB' 'Slab: 1023652 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733436 kB' 'KernelStack: 24912 kB' 'PageTables: 8292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12666064 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230236 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB'
00:04:29.688 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: the IFS=': ' / read -r var val _ / continue triplet repeats for every non-matching key from MemTotal through HugePages_Free]
00:04:29.689 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:29.689 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:29.689 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:29.689 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:29.689 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:29.689 nr_hugepages=1024
09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:29.689 resv_hugepages=0
09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:29.689 surplus_hugepages=0
09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:29.689 anon_hugepages=0
09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:29.689 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
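
At setup/hugepages.sh@102-109 the test reports nr_hugepages=1024 with zero reserved, surplus and anonymous huge pages, then checks that the pool it configured adds up before querying HugePages_Total. Restated as a self-contained sanity check (my own simplification of that arithmetic, not the hugepages.sh source; awk pulls the counters straight from /proc/meminfo):

nr_hugepages=1024   # the 2048 kB pages this even_2G_alloc run requested
hp_total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
hp_rsvd=$(awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo)
hp_surp=$(awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo)
(( hp_total == nr_hugepages + hp_surp + hp_rsvd )) || echo 'unexpected hugepage accounting' >&2
(( hp_total == nr_hugepages )) && echo "all $nr_hugepages huge pages accounted for"
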
00:04:29.689 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:29.689 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:29.689 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:29.689 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105457304 kB' 'MemAvailable: 108717664 kB' 'Buffers: 3736 kB' 'Cached: 14520024 kB' 'SwapCached: 0 kB' 'Active: 11564552 kB' 'Inactive: 3520652 kB' 'Active(anon): 11145608 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564784 kB' 'Mapped: 176532 kB' 'Shmem: 10584164 kB' 'KReclaimable: 290216 kB' 'Slab: 1023652 kB' 'SReclaimable: 290216 kB' 'SUnreclaim: 733436 kB' 'KernelStack: 24912 kB' 'PageTables: 8288 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12666084 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230236 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB'
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: the IFS=': ' / read -r var val _ / continue triplet repeats for every non-matching key from MemTotal through AnonPages]
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read
-r var val _ 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.690 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.691 
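The repetitive IFS=': ' / read -r var val _ / continue lines being traced here are setup/common.sh's get_meminfo helper walking a meminfo dump field by field, skipping every key until it reaches the requested one (HugePages_Total at this point, which yields 1024 just below). A condensed sketch of that lookup, reconstructed from the trace rather than copied from the repository (the helper name and exact structure here are illustrative only):

#!/usr/bin/env bash
# Illustrative reconstruction of the lookup traced above; the real
# setup/common.sh keeps more state and argument handling.
shopt -s extglob

get_meminfo_sketch() {
    local get=$1 node=$2
    local mem_f=/proc/meminfo
    # A node argument switches to that node's meminfo, as the later per-node calls do.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem <"$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")        # strip the "Node N " prefix on per-node files
    local var val _
    while IFS=': ' read -r var val _; do    # the IFS/read/continue pattern seen in the trace
        [[ $var == "$get" ]] && echo "$val" && return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

get_meminfo_sketch HugePages_Total          # prints 1024 on this host, matching the trace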
09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.691 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 59135120 kB' 'MemUsed: 6526880 kB' 'SwapCached: 0 kB' 'Active: 3913340 kB' 'Inactive: 152040 kB' 'Active(anon): 3811828 kB' 'Inactive(anon): 0 kB' 'Active(file): 101512 kB' 'Inactive(file): 152040 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3736040 kB' 'Mapped: 38072 kB' 'AnonPages: 332568 kB' 'Shmem: 3482488 kB' 'KernelStack: 12344 kB' 'PageTables: 4700 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 108380 kB' 'Slab: 437028 kB' 'SReclaimable: 108380 kB' 'SUnreclaim: 328648 kB' 
'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.692 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682028 kB' 'MemFree: 46322520 kB' 'MemUsed: 14359508 kB' 'SwapCached: 0 kB' 'Active: 7651228 kB' 'Inactive: 3368612 kB' 'Active(anon): 7333796 kB' 'Inactive(anon): 0 kB' 'Active(file): 317432 kB' 'Inactive(file): 3368612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10787740 kB' 'Mapped: 138460 kB' 'AnonPages: 232212 kB' 'Shmem: 7101696 kB' 'KernelStack: 12552 kB' 
'PageTables: 3588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 181836 kB' 'Slab: 586624 kB' 'SReclaimable: 181836 kB' 'SUnreclaim: 404788 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.693 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 
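For the even_2G_alloc check, each of the two nodes is expected to hold 512 of the 1024 pages, and the per-node scans around this point (node0 above, node1 finishing just below) add any reserved/surplus pages reported by that node's meminfo before the comparison is echoed. A simplified sketch of that accounting, reusing the hypothetical get_meminfo_sketch helper from above (the real setup/hugepages.sh loop carries more state, e.g. the sorted_t/sorted_s bookkeeping seen in the trace):

# Simplified accounting only, not the literal hugepages.sh loop.
nodes_test=(512 512)                        # expected split of the 1024 pages
for node in "${!nodes_test[@]}"; do
    surp=$(get_meminfo_sketch HugePages_Surp "$node")
    (( nodes_test[node] += surp ))          # surp is 0 on both nodes in this run
    echo "node$node=${nodes_test[node]} expecting 512"
done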
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:29.694 node0=512 expecting 512 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:29.694 node1=512 expecting 512 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:29.694 00:04:29.694 real 0m4.159s 00:04:29.694 user 0m1.588s 00:04:29.694 sys 0m2.645s 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:29.694 09:59:51 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:29.694 ************************************ 00:04:29.694 END TEST even_2G_alloc 00:04:29.694 ************************************ 00:04:29.694 09:59:51 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:29.694 09:59:51 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:29.694 09:59:51 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:29.694 09:59:51 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:29.694 ************************************ 00:04:29.694 START TEST odd_alloc 00:04:29.694 
************************************ 00:04:29.694 09:59:51 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # odd_alloc 00:04:29.694 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:29.694 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:29.694 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:29.694 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:29.695 09:59:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:33.907 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 
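By this point odd_alloc has set up its request: HUGEMEM=2049 (megabytes) becomes size=2098176 kB, which at the default 2048 kB hugepage size is 1024.5 pages; the trace shows the result rounded up to nr_hugepages=1025, split 513/512 across the two nodes, before scripts/setup.sh reprograms the pools. The rounding expression below is my own illustration rather than the exact line from setup/hugepages.sh, but it reproduces the traced numbers, and the Hugetlb: 2099200 kB field reported a little further down matches 1025 pages:

# Worked numbers for the odd_alloc request; the values come from the trace,
# the ceiling division is illustrative.
size_kb=$(( 2049 * 1024 ))                      # HUGEMEM=2049 MB -> 2098176 kB
page_kb=2048                                    # Hugepagesize: 2048 kB
echo $(( (size_kb + page_kb - 1) / page_kb ))   # 1025 pages
echo $(( 1025 * page_kb ))                      # 2099200 kB, the Hugetlb value seen below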
0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:33.907 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105455472 kB' 'MemAvailable: 108715816 kB' 'Buffers: 3736 kB' 'Cached: 14520132 kB' 'SwapCached: 0 kB' 'Active: 11565684 kB' 'Inactive: 3520652 kB' 'Active(anon): 11146740 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 565704 kB' 'Mapped: 176548 kB' 'Shmem: 10584272 kB' 'KReclaimable: 290184 kB' 'Slab: 1023804 kB' 'SReclaimable: 290184 kB' 'SUnreclaim: 733620 kB' 'KernelStack: 24928 kB' 'PageTables: 8428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511016 kB' 'Committed_AS: 12666840 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230188 kB' 
'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.907 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.908 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.908 
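Each get_meminfo call traced here walks every 'key: value' line of meminfo with IFS=': ' read -r var val _ and issues a continue until the requested key (AnonHugePages, HugePages_Surp, HugePages_Rsvd) matches, which is why a single lookup produces the long run of near-identical [[ ... ]] / continue lines. The helper below is a compact sketch of the same lookup, assuming only the system-wide /proc/meminfo; the per-node /sys/devices/system/node/nodeN/meminfo handling and the 'Node N ' prefix stripping seen in the setup/common.sh trace are omitted, and the name get_meminfo_value is made up for illustration.

# Illustrative sketch only; equivalent effect to the traced get_meminfo lookups.
get_meminfo_value() {                     # usage: get_meminfo_value AnonHugePages
    local key=$1 k v _
    while IFS=': ' read -r k v _; do
        [[ $k == "$key" ]] && { echo "$v"; return 0; }
    done < /proc/meminfo
    return 1
}
get_meminfo_value AnonHugePages           # prints 0 on this box, matching anon=0 in the trace above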
09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105455472 kB' 'MemAvailable: 108715816 kB' 'Buffers: 3736 kB' 'Cached: 14520132 kB' 'SwapCached: 0 kB' 'Active: 11565780 kB' 'Inactive: 3520652 kB' 'Active(anon): 11146836 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 565836 kB' 'Mapped: 176540 kB' 'Shmem: 10584272 kB' 'KReclaimable: 290184 kB' 'Slab: 1023804 kB' 'SReclaimable: 290184 kB' 'SUnreclaim: 733620 kB' 'KernelStack: 24896 kB' 'PageTables: 8324 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511016 kB' 'Committed_AS: 12666856 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230156 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.909 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
[[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105454972 kB' 'MemAvailable: 108715316 kB' 'Buffers: 3736 kB' 'Cached: 14520132 kB' 'SwapCached: 0 kB' 'Active: 11565752 kB' 'Inactive: 3520652 kB' 'Active(anon): 11146808 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 565808 kB' 'Mapped: 176540 kB' 'Shmem: 10584272 kB' 'KReclaimable: 290184 kB' 'Slab: 1023820 kB' 'SReclaimable: 290184 kB' 'SUnreclaim: 733636 kB' 'KernelStack: 24896 kB' 'PageTables: 8340 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511016 kB' 'Committed_AS: 12666876 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230156 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.910 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 
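Once the anon, surp and rsvd lookups finish, the test closes the same way the even_2G_alloc run above did, printing a 'nodeN=<actual> expecting <expected>' line per NUMA node after folding any surplus pages into the expected count (setup/hugepages.sh@117). The sketch below only reconstructs the shape of that per-node check; it is not the SPDK code, and verify_node_hugepages is a hypothetical name.

# Illustrative sketch only; the real check lives in setup/hugepages.sh.
verify_node_hugepages() {                 # usage: verify_node_hugepages 0 513 0
    local node=$1 expected=$2 surplus=$3 actual
    actual=$(awk '$3 == "HugePages_Total:" {print $4}' \
                 "/sys/devices/system/node/node${node}/meminfo")
    (( expected += surplus ))             # surplus is 0 in these runs
    echo "node${node}=${actual} expecting ${expected}"
    [[ $actual -eq $expected ]]           # non-zero exit would fail the test
}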
09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.911 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:33.912 nr_hugepages=1025 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:33.912 resv_hugepages=0 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:33.912 surplus_hugepages=0 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:33.912 anon_hugepages=0 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.912 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105454972 kB' 'MemAvailable: 108715316 kB' 'Buffers: 3736 kB' 'Cached: 14520132 kB' 'SwapCached: 0 kB' 'Active: 11565752 kB' 'Inactive: 3520652 kB' 'Active(anon): 11146808 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 565808 kB' 'Mapped: 176540 kB' 'Shmem: 10584272 kB' 'KReclaimable: 290184 kB' 'Slab: 1023820 kB' 'SReclaimable: 290184 kB' 'SUnreclaim: 733636 kB' 'KernelStack: 24896 kB' 'PageTables: 8340 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511016 kB' 'Committed_AS: 12666896 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230156 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.913 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 59123572 kB' 'MemUsed: 6538428 kB' 'SwapCached: 0 kB' 'Active: 3914620 kB' 'Inactive: 152040 kB' 'Active(anon): 3813108 kB' 'Inactive(anon): 0 kB' 'Active(file): 101512 kB' 'Inactive(file): 152040 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3736100 kB' 'Mapped: 38096 kB' 'AnonPages: 333804 kB' 'Shmem: 3482548 kB' 'KernelStack: 12328 kB' 'PageTables: 4784 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 108380 kB' 'Slab: 436956 kB' 'SReclaimable: 108380 kB' 'SUnreclaim: 328576 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 
0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.914 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.915 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682028 kB' 'MemFree: 46330152 kB' 'MemUsed: 14351876 kB' 'SwapCached: 0 kB' 'Active: 7651020 kB' 'Inactive: 3368612 kB' 'Active(anon): 7333588 kB' 'Inactive(anon): 0 kB' 'Active(file): 317432 kB' 'Inactive(file): 3368612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10787828 kB' 'Mapped: 138444 kB' 'AnonPages: 231812 kB' 'Shmem: 7101784 kB' 'KernelStack: 12552 kB' 'PageTables: 3544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 181804 kB' 'Slab: 586864 kB' 'SReclaimable: 181804 kB' 'SUnreclaim: 405060 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.916 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node 
in "${!nodes_test[@]}" 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:33.917 node0=512 expecting 513 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:33.917 node1=513 expecting 512 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:33.917 00:04:33.917 real 0m4.170s 00:04:33.917 user 0m1.589s 00:04:33.917 sys 0m2.655s 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:33.917 09:59:55 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:33.917 ************************************ 00:04:33.917 END TEST odd_alloc 00:04:33.917 ************************************ 00:04:33.917 09:59:55 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:33.917 09:59:55 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:33.917 09:59:55 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:33.917 09:59:55 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:33.917 ************************************ 00:04:33.917 START TEST custom_alloc 00:04:33.917 ************************************ 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # custom_alloc 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:33.917 09:59:55 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:33.917 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in 
"${!nodes_hp[@]}" 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:33.918 09:59:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:38.128 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:38.128 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 
0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 104428504 kB' 'MemAvailable: 107688848 kB' 'Buffers: 3736 kB' 'Cached: 14520312 kB' 'SwapCached: 0 kB' 'Active: 11567460 kB' 'Inactive: 3520652 kB' 'Active(anon): 11148516 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 567384 kB' 'Mapped: 176580 kB' 'Shmem: 10584452 kB' 'KReclaimable: 290184 kB' 'Slab: 1024316 kB' 'SReclaimable: 290184 kB' 'SUnreclaim: 734132 kB' 'KernelStack: 24912 kB' 'PageTables: 8376 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987752 kB' 'Committed_AS: 12667788 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230220 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 
'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.128 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.129 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 
00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 104428668 kB' 'MemAvailable: 107689012 kB' 'Buffers: 3736 kB' 'Cached: 14520312 kB' 'SwapCached: 0 kB' 'Active: 11566708 kB' 'Inactive: 3520652 kB' 'Active(anon): 11147764 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566636 kB' 'Mapped: 176560 kB' 'Shmem: 10584452 kB' 'KReclaimable: 290184 kB' 'Slab: 1024368 kB' 'SReclaimable: 290184 kB' 'SUnreclaim: 734184 kB' 'KernelStack: 24896 kB' 'PageTables: 8352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987752 kB' 'Committed_AS: 12667804 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230204 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 
09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.130 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:38.131 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 104428416 kB' 'MemAvailable: 107688760 kB' 'Buffers: 3736 kB' 'Cached: 14520332 kB' 'SwapCached: 0 kB' 'Active: 11566736 kB' 'Inactive: 3520652 kB' 'Active(anon): 11147792 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566640 kB' 'Mapped: 176560 kB' 'Shmem: 10584472 kB' 'KReclaimable: 290184 kB' 'Slab: 1024368 kB' 'SReclaimable: 290184 kB' 'SUnreclaim: 734184 
kB' 'KernelStack: 24896 kB' 'PageTables: 8352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987752 kB' 'Committed_AS: 12667824 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230220 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 
09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.132 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
[[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.133 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- 
# [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:38.134 nr_hugepages=1536 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:38.134 resv_hugepages=0 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:38.134 surplus_hugepages=0 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:38.134 anon_hugepages=0 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 104428416 kB' 'MemAvailable: 107688760 kB' 'Buffers: 3736 kB' 'Cached: 14520352 kB' 'SwapCached: 0 kB' 'Active: 11566644 kB' 'Inactive: 3520652 kB' 'Active(anon): 11147700 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566492 kB' 'Mapped: 176560 kB' 'Shmem: 10584492 kB' 'KReclaimable: 290184 kB' 'Slab: 1024368 kB' 'SReclaimable: 290184 kB' 'SUnreclaim: 734184 kB' 'KernelStack: 24880 kB' 'PageTables: 8292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987752 kB' 'Committed_AS: 12667848 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230220 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 
'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 
-- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.134 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 59142560 kB' 'MemUsed: 6519440 kB' 'SwapCached: 0 kB' 'Active: 3915704 kB' 'Inactive: 152040 kB' 'Active(anon): 3814192 kB' 'Inactive(anon): 0 kB' 'Active(file): 101512 kB' 'Inactive(file): 152040 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3736180 kB' 'Mapped: 38616 kB' 'AnonPages: 334636 kB' 'Shmem: 3482628 kB' 'KernelStack: 12264 kB' 'PageTables: 4548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 108380 kB' 'Slab: 437384 kB' 'SReclaimable: 108380 kB' 'SUnreclaim: 329004 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.135 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 
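The iterations above are the tail of a meminfo lookup for NUMA node 0: the scan finally reaches HugePages_Surp, echoes 0, returns, and hugepages.sh folds the value into its per-node tally (the (( nodes_test[node] += resv )) and += 0 lines). The block that follows repeats the identical lookup against /sys/devices/system/node/node1/meminfo. A condensed sketch of that lookup pattern, with an illustrative helper name rather than the real one (the traced implementation is get_meminfo in SPDK's test/setup/common.sh):

  shopt -s extglob                        # needed for the +([0-9]) pattern below
  meminfo_lookup() {                      # illustrative stand-in, not the SPDK helper itself
      local get=$1 node=${2:-} line var val _
      local mem_f=/proc/meminfo
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local -a mem
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")    # per-node files prefix every key with "Node N "
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done
      return 1
  }
  meminfo_lookup HugePages_Surp 1         # prints 0 on the node-1 snapshot logged below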
00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682028 kB' 'MemFree: 45285912 kB' 'MemUsed: 15396116 kB' 'SwapCached: 0 kB' 'Active: 7653324 kB' 'Inactive: 3368612 kB' 'Active(anon): 7335892 kB' 'Inactive(anon): 0 kB' 'Active(file): 317432 kB' 'Inactive(file): 3368612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10787952 kB' 'Mapped: 138456 kB' 'AnonPages: 233956 kB' 'Shmem: 7101908 kB' 'KernelStack: 12680 kB' 'PageTables: 3568 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 181804 kB' 'Slab: 586984 kB' 'SReclaimable: 181804 kB' 'SUnreclaim: 405180 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
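The field-by-field scan of node 1 continues below; what it feeds is a per-node tally that the test finally prints as node0=512 expecting 512 and node1=1024 expecting 1024 (see the summary further down, just before END TEST custom_alloc). A condensed, illustrative sketch of that bookkeeping, simplified to the surplus term (the traced script also folds in reserved pages at hugepages.sh@116) and reusing the lookup helper sketched above:

  nodes_test=([0]=512 [1]=1024)           # pages the test requested per NUMA node
  for node in "${!nodes_test[@]}"; do
      surp=$(meminfo_lookup HugePages_Surp "$node")   # 0 for both nodes in this run
      (( nodes_test[node] += surp ))
      echo "node$node=${nodes_test[node]} expecting ${nodes_test[node]}"
  done
  [[ ${nodes_test[0]},${nodes_test[1]} == "512,1024" ]] && echo "custom_alloc layout as expected"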
00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc 
-- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:38.138 node0=512 expecting 512
00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:38.138 node1=1024 expecting 1024
00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:38.138
00:04:38.138 real 0m3.877s
00:04:38.138 user 0m1.448s
00:04:38.138 sys 0m2.486s
00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable
00:04:38.138 09:59:59 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:38.138 ************************************
00:04:38.138 END TEST custom_alloc
00:04:38.138 ************************************
00:04:38.138 09:59:59 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:38.138 09:59:59 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:04:38.138 09:59:59 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable
00:04:38.138 09:59:59 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:38.138 ************************************
00:04:38.138 START TEST no_shrink_alloc
00:04:38.138 ************************************
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # no_shrink_alloc
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc --
setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:38.138 09:59:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:42.347 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:42.347 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.347 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' 
]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105345560 kB' 'MemAvailable: 108605888 kB' 'Buffers: 3736 kB' 'Cached: 14520480 kB' 'SwapCached: 0 kB' 'Active: 11575724 kB' 'Inactive: 3520652 kB' 'Active(anon): 11156780 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 574932 kB' 'Mapped: 177472 kB' 'Shmem: 10584620 kB' 'KReclaimable: 290152 kB' 'Slab: 1024332 kB' 'SReclaimable: 290152 kB' 'SUnreclaim: 734180 kB' 'KernelStack: 25168 kB' 'PageTables: 9132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12679324 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230432 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.348 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.349 10:00:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105349100 kB' 'MemAvailable: 108609428 kB' 'Buffers: 3736 kB' 'Cached: 14520484 kB' 'SwapCached: 0 kB' 'Active: 11575724 kB' 'Inactive: 3520652 kB' 'Active(anon): 11156780 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575440 kB' 'Mapped: 177472 kB' 'Shmem: 10584624 kB' 'KReclaimable: 290152 kB' 'Slab: 1024320 kB' 'SReclaimable: 290152 kB' 'SUnreclaim: 734168 kB' 'KernelStack: 25136 kB' 'PageTables: 9192 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12680948 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230384 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 
'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 
10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.350 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.351 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105350040 kB' 'MemAvailable: 108610368 kB' 'Buffers: 3736 kB' 'Cached: 14520500 kB' 'SwapCached: 0 kB' 'Active: 11574640 kB' 'Inactive: 3520652 kB' 'Active(anon): 11155696 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 574248 kB' 'Mapped: 177400 kB' 'Shmem: 10584640 kB' 'KReclaimable: 290152 kB' 'Slab: 1024332 kB' 'SReclaimable: 290152 kB' 'SUnreclaim: 734180 kB' 'KernelStack: 25040 kB' 'PageTables: 9100 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12680604 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230336 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 
kB' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:42.352 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 
10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.353 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:42.354 10:00:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:42.354 nr_hugepages=1024 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:42.354 resv_hugepages=0 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:42.354 surplus_hugepages=0 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:42.354 anon_hugepages=0 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105354596 kB' 'MemAvailable: 108614924 kB' 'Buffers: 3736 kB' 'Cached: 14520520 kB' 'SwapCached: 0 kB' 'Active: 11574912 kB' 'Inactive: 3520652 kB' 'Active(anon): 11155968 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 574532 kB' 'Mapped: 177400 kB' 'Shmem: 10584660 kB' 'KReclaimable: 290152 kB' 'Slab: 1024300 kB' 'SReclaimable: 290152 kB' 'SUnreclaim: 734148 kB' 'KernelStack: 25168 kB' 'PageTables: 9192 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12680992 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230400 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 
3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.354 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 
10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.355 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # 
nodes_sys[${node##*node}]=1024 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 58050320 kB' 'MemUsed: 7611680 kB' 'SwapCached: 0 kB' 'Active: 3916732 kB' 'Inactive: 152040 kB' 'Active(anon): 3815220 kB' 'Inactive(anon): 0 kB' 'Active(file): 101512 kB' 'Inactive(file): 152040 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3736272 kB' 'Mapped: 38796 kB' 'AnonPages: 335644 kB' 'Shmem: 3482720 kB' 'KernelStack: 12344 kB' 'PageTables: 4704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 108380 kB' 'Slab: 437280 kB' 'SReclaimable: 108380 kB' 'SUnreclaim: 328900 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.356 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.357 10:00:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.357 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:42.358 node0=1024 expecting 1024 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:42.358 10:00:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:45.655 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:45.655 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:45.655 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:45.655 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:45.655 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:45.655 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:45.655 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:45.655 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:45.655 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:45.656 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:45.656 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:45.656 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:45.656 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:45.656 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:45.656 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:45.656 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:45.656 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:45.656 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- 
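The long run of == comparisons traced above is setup/common.sh's get_meminfo helper: it picks /proc/meminfo or the per-node /sys/devices/system/node/nodeN/meminfo file, strips the "Node N " prefix, and walks the file key by key until it reaches the requested field, here HugePages_Surp for node 0, whose value (0) it echoes before returning. A minimal sketch of that parsing logic, reconstructed from the trace rather than copied from the SPDK source (get_meminfo_sketch is an illustrative name, not the real function):

shopt -s extglob
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local var val _
    local mem_f=/proc/meminfo
    # Per-node counters live in /sys/devices/system/node/nodeN/meminfo when a node is given.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node N "; strip it so the keys match /proc/meminfo.
    mem=("${mem[@]#Node +([0-9]) }")
    local line
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}
# Example, matching the node0 dump above: get_meminfo_sketch HugePages_Surp 0  ->  0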
setup/hugepages.sh@90 -- # local sorted_t 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105375396 kB' 'MemAvailable: 108635708 kB' 'Buffers: 3736 kB' 'Cached: 14520632 kB' 'SwapCached: 0 kB' 'Active: 11575604 kB' 'Inactive: 3520652 kB' 'Active(anon): 11156660 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575048 kB' 'Mapped: 177416 kB' 'Shmem: 10584772 kB' 'KReclaimable: 290120 kB' 'Slab: 1023408 kB' 'SReclaimable: 290120 kB' 'SUnreclaim: 733288 kB' 'KernelStack: 25072 kB' 'PageTables: 8800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12681356 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230496 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.927 10:00:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.927 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # 
mapfile -t mem 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105377480 kB' 'MemAvailable: 108637792 kB' 'Buffers: 3736 kB' 'Cached: 14520636 kB' 'SwapCached: 0 kB' 'Active: 11576032 kB' 'Inactive: 3520652 kB' 'Active(anon): 11157088 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575544 kB' 'Mapped: 177416 kB' 'Shmem: 10584776 kB' 'KReclaimable: 290120 kB' 'Slab: 1023392 kB' 'SReclaimable: 290120 kB' 'SUnreclaim: 733272 kB' 'KernelStack: 25088 kB' 'PageTables: 8936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12681380 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230448 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 
10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.928 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105377772 kB' 'MemAvailable: 108638084 kB' 'Buffers: 3736 kB' 'Cached: 14520648 kB' 'SwapCached: 0 kB' 'Active: 11575560 kB' 'Inactive: 3520652 kB' 'Active(anon): 11156616 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575044 kB' 'Mapped: 177428 kB' 'Shmem: 10584788 kB' 'KReclaimable: 290120 kB' 'Slab: 1023300 kB' 'SReclaimable: 290120 kB' 'SUnreclaim: 733180 kB' 'KernelStack: 25152 kB' 'PageTables: 9272 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12681404 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230448 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 
10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.929 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:45.930 nr_hugepages=1024 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:45.930 resv_hugepages=0 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:45.930 surplus_hugepages=0 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:45.930 anon_hugepages=0 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344028 kB' 'MemFree: 105378108 kB' 'MemAvailable: 108638420 kB' 'Buffers: 3736 kB' 'Cached: 14520672 kB' 'SwapCached: 0 kB' 'Active: 11576088 kB' 'Inactive: 3520652 kB' 'Active(anon): 11157144 kB' 'Inactive(anon): 0 kB' 'Active(file): 418944 kB' 'Inactive(file): 3520652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575572 kB' 'Mapped: 177428 kB' 'Shmem: 10584812 kB' 'KReclaimable: 290120 kB' 'Slab: 1023300 kB' 'SReclaimable: 290120 kB' 'SUnreclaim: 733180 kB' 'KernelStack: 25104 kB' 'PageTables: 9212 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512040 kB' 'Committed_AS: 12681556 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 230416 kB' 'VmallocChunk: 0 kB' 'Percpu: 103424 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3287332 kB' 'DirectMap2M: 23656448 kB' 'DirectMap1G: 109051904 kB' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
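[editor's note] The xtrace above is the meminfo helper scanning a full /proc/meminfo snapshot key by key: each line of the snapshot produces one "[[ key == HugePages_Total ]]" test followed by "continue" until the requested counter is reached, at which point its value is echoed back to the caller. The sketch below shows that lookup pattern in isolation; the function name get_meminfo_sketch and its argument handling are illustrative, not the exact setup/common.sh helper.

#!/usr/bin/env bash
# Minimal sketch of the lookup pattern visible in the xtrace above:
# snapshot a meminfo file, strip any "Node <id> " prefix, then scan
# key by key until the requested counter is found.
shopt -s extglob

get_meminfo_sketch() {
    local get=$1 node=$2
    local mem_f=/proc/meminfo
    # Per-node queries read the node-local file under sysfs instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    # Node files prefix every line with "Node <id> "; drop that prefix.
    mem=("${mem[@]#Node +([0-9]) }")
    local line var val _
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        # Echo the value and stop as soon as the requested key matches;
        # every other key is skipped, which is what produces the long
        # runs of "continue" in the log.
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

# Example: system-wide surplus pages, then reserved pages on node 0.
get_meminfo_sketch HugePages_Surp
get_meminfo_sketch HugePages_Rsvd 0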
00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:45.930 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 
65662000 kB' 'MemFree: 58077076 kB' 'MemUsed: 7584924 kB' 'SwapCached: 0 kB' 'Active: 3918004 kB' 'Inactive: 152040 kB' 'Active(anon): 3816492 kB' 'Inactive(anon): 0 kB' 'Active(file): 101512 kB' 'Inactive(file): 152040 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3736424 kB' 'Mapped: 38824 kB' 'AnonPages: 336700 kB' 'Shmem: 3482872 kB' 'KernelStack: 12472 kB' 'PageTables: 5460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 108380 kB' 'Slab: 436760 kB' 'SReclaimable: 108380 kB' 'SUnreclaim: 328380 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 
10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:45.931 node0=1024 expecting 1024 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:45.931 00:04:45.931 real 0m8.093s 00:04:45.931 user 0m3.107s 00:04:45.931 sys 0m5.126s 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:45.931 10:00:07 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:45.931 ************************************ 00:04:45.931 END TEST no_shrink_alloc 00:04:45.931 ************************************ 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 
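The trace above is setup/common.sh's get_meminfo scanning /sys/devices/system/node/node0/meminfo one field at a time until it reaches HugePages_Surp, and the clear_hp teardown that follows writes 0 back into every node's hugepage pools. A compact standalone sketch of the same per-node bookkeeping is below; it is illustrative only, assumes the stock sysfs layout, and touches just the 2048 kB pool, whereas the test's clear_hp loops over every hugepages-* directory.

#!/usr/bin/env bash
# Sketch only -- not the SPDK helper. Print each NUMA node's HugePages_*
# counters (nodeN/meminfo prefixes every line with "Node N") and then zero
# that node's 2 MB hugepage pool, mirroring what clear_hp does in the trace.
set -euo pipefail

for node_dir in /sys/devices/system/node/node[0-9]*; do
    node=${node_dir##*node}
    # "Node 0 HugePages_Total:  1024"  ->  "node0 HugePages_Total: 1024"
    awk -v n="$node" '$3 ~ /^HugePages_/ { printf "node%s %s %s\n", n, $3, $4 }' \
        "$node_dir/meminfo"
    # return this node's 2048 kB pool to 0 (needs root)
    echo 0 | sudo tee "$node_dir/hugepages/hugepages-2048kB/nr_hugepages" >/dev/null
done

Run against the node traced here, the loop would report node0 HugePages_Total: 1024 and node0 HugePages_Free: 1024 before the reset, consistent with the test's closing 'node0=1024 expecting 1024' check.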
00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:45.931 10:00:07 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:45.931 00:04:45.931 real 0m31.144s 00:04:45.931 user 0m11.141s 00:04:45.931 sys 0m18.669s 00:04:45.931 10:00:07 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:45.931 10:00:07 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:45.931 ************************************ 00:04:45.931 END TEST hugepages 00:04:45.931 ************************************ 00:04:45.931 10:00:07 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:45.931 10:00:07 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:45.931 10:00:07 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:45.931 10:00:07 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:46.191 ************************************ 00:04:46.191 START TEST driver 00:04:46.191 ************************************ 00:04:46.191 10:00:07 setup.sh.driver -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:46.191 * Looking for test storage... 
00:04:46.191 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:46.191 10:00:07 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:46.191 10:00:07 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:46.191 10:00:07 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:51.476 10:00:13 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:51.476 10:00:13 setup.sh.driver -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:51.476 10:00:13 setup.sh.driver -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:51.476 10:00:13 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:51.476 ************************************ 00:04:51.476 START TEST guess_driver 00:04:51.476 ************************************ 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # guess_driver 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 370 > 0 )) 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:51.476 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:51.476 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:51.476 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:51.476 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:51.476 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:51.476 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:51.476 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- 
setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:51.476 Looking for driver=vfio-pci 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:51.476 10:00:13 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:55.685 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.685 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.685 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 
setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.686 10:00:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:57.600 10:00:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:57.600 10:00:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:57.600 10:00:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:57.600 10:00:19 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:57.600 10:00:19 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:57.600 10:00:19 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:57.600 10:00:19 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:02.890 00:05:02.890 real 0m11.356s 00:05:02.890 user 0m3.165s 00:05:02.890 sys 0m5.538s 00:05:02.890 10:00:24 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:02.890 10:00:24 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:02.890 ************************************ 00:05:02.890 END TEST guess_driver 00:05:02.890 ************************************ 00:05:02.890 00:05:02.890 real 0m16.696s 00:05:02.890 user 0m4.623s 00:05:02.890 sys 0m8.594s 00:05:02.890 10:00:24 setup.sh.driver -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:02.890 
10:00:24 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:02.890 ************************************ 00:05:02.890 END TEST driver 00:05:02.890 ************************************ 00:05:02.890 10:00:24 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:02.890 10:00:24 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:02.890 10:00:24 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:02.890 10:00:24 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:02.890 ************************************ 00:05:02.890 START TEST devices 00:05:02.890 ************************************ 00:05:02.890 10:00:24 setup.sh.devices -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:02.890 * Looking for test storage... 00:05:02.890 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:02.890 10:00:24 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:02.890 10:00:24 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:02.890 10:00:24 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:02.890 10:00:24 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:07.101 10:00:28 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:07.101 10:00:28 setup.sh.devices -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:05:07.101 10:00:28 setup.sh.devices -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:05:07.101 10:00:28 setup.sh.devices -- common/autotest_common.sh@1669 -- # local nvme bdf 00:05:07.362 10:00:28 setup.sh.devices -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:05:07.362 10:00:28 setup.sh.devices -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:05:07.362 10:00:28 setup.sh.devices -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:05:07.362 10:00:28 setup.sh.devices -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:07.362 10:00:28 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:05:07.362 10:00:28 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:07.362 10:00:28 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:07.362 10:00:28 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:07.362 10:00:28 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:07.362 10:00:28 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:07.362 10:00:28 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:07.362 10:00:28 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:07.362 10:00:28 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:07.362 10:00:28 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:65:00.0 00:05:07.362 10:00:28 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]] 00:05:07.362 10:00:28 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:07.362 10:00:28 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:05:07.362 10:00:28 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:07.363 No valid GPT data, bailing 
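At this point the devices test is screening block devices: the zoned check above excluded nothing (queue/zoned reads "none"), spdk-gpt.py just reported no GPT on nvme0n1, and the blkid PTTYPE probe and minimum-size comparison continue directly below. A rough standalone sketch of that screening follows, under stated assumptions: the 3 GiB floor mirrors min_disk_size=3221225472 from the trace, while the glob and the in-use test are simplified relative to devices.sh.

#!/usr/bin/env bash
# Sketch only -- not test/setup/devices.sh. Pick candidate whole-disk NVMe
# namespaces: skip zoned namespaces, skip disks that already carry a
# partition table, and require a minimum capacity.
set -euo pipefail
min_disk_size=$((3 * 1024 * 1024 * 1024))   # 3221225472 bytes, as in the trace

for blk in /sys/block/nvme*n*; do
    dev=${blk##*/}
    # regular (non-zoned) namespaces report "none" here
    [[ $(cat "$blk/queue/zoned" 2>/dev/null) == none ]] || continue
    # an existing partition table marks the disk as in use
    if blkid -s PTTYPE -o value "/dev/$dev" | grep -q .; then
        continue
    fi
    # the sysfs size file counts 512-byte sectors
    bytes=$(( $(cat "$blk/size") * 512 ))
    if (( bytes >= min_disk_size )); then
        echo "candidate: /dev/$dev ($bytes bytes)"
    fi
done

The trace reaches the same outcome for nvme0n1: blkid finds no PTTYPE, the 2000398934016-byte capacity clears the floor, and nvme0n1 becomes the test disk.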
00:05:07.363 10:00:29 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:07.363 10:00:29 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:05:07.363 10:00:29 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:05:07.363 10:00:29 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:07.363 10:00:29 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:07.363 10:00:29 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:07.363 10:00:29 setup.sh.devices -- setup/common.sh@80 -- # echo 2000398934016 00:05:07.363 10:00:29 setup.sh.devices -- setup/devices.sh@204 -- # (( 2000398934016 >= min_disk_size )) 00:05:07.363 10:00:29 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:07.363 10:00:29 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:65:00.0 00:05:07.363 10:00:29 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:07.363 10:00:29 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:07.363 10:00:29 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:07.363 10:00:29 setup.sh.devices -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:07.363 10:00:29 setup.sh.devices -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:07.363 10:00:29 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:07.363 ************************************ 00:05:07.363 START TEST nvme_mount 00:05:07.363 ************************************ 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # nvme_mount 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- 
setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:07.363 10:00:29 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:08.302 Creating new GPT entries in memory. 00:05:08.302 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:08.302 other utilities. 00:05:08.302 10:00:30 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:08.302 10:00:30 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:08.302 10:00:30 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:08.302 10:00:30 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:08.302 10:00:30 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:09.699 Creating new GPT entries in memory. 00:05:09.699 The operation has completed successfully. 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 889269 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:65:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:09.700 10:00:31 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.116 10:00:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:13.391 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:13.391 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:13.652 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:13.652 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:05:13.652 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:13.652 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # 
mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:65:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:13.652 10:00:35 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 
setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:17.855 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:65:00.0 data@nvme0n1 '' '' 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:17.856 10:00:39 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.158 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.419 10:00:43 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.419 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.680 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:21.680 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:21.680 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:21.680 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:21.680 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:21.680 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:21.680 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:21.680 10:00:43 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:21.680 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:21.680 00:05:21.680 real 0m14.229s 00:05:21.680 user 0m4.351s 00:05:21.680 sys 0m7.737s 00:05:21.680 10:00:43 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:21.680 10:00:43 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:21.680 ************************************ 00:05:21.680 END TEST nvme_mount 00:05:21.680 ************************************ 00:05:21.680 10:00:43 setup.sh.devices -- 
setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:21.680 10:00:43 setup.sh.devices -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:21.680 10:00:43 setup.sh.devices -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:21.680 10:00:43 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:21.680 ************************************ 00:05:21.680 START TEST dm_mount 00:05:21.680 ************************************ 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # dm_mount 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:21.680 10:00:43 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:22.622 Creating new GPT entries in memory. 00:05:22.622 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:22.622 other utilities. 00:05:22.622 10:00:44 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:22.622 10:00:44 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:22.622 10:00:44 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:22.622 10:00:44 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:22.622 10:00:44 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:24.005 Creating new GPT entries in memory. 00:05:24.005 The operation has completed successfully. 
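The partition_drive trace above reduces to a zap of the old label plus one sgdisk --new call per partition; the sector numbers it produced (2048:2099199 and 2099200:4196351) match two 1 GiB partitions. A minimal standalone sketch of that sequence, leaving out the sync_dev_uevents.sh wrapper the test uses to wait for udev:

disk=/dev/nvme0n1
size=$(( 1073741824 / 512 ))            # 1 GiB expressed in 512-byte sectors, as in setup/common.sh@51
sgdisk "$disk" --zap-all                # destroy any existing GPT/MBR metadata first
part_start=2048
for part in 1 2; do
  part_end=$(( part_start + size - 1 ))
  flock "$disk" sgdisk "$disk" --new=${part}:${part_start}:${part_end}
  part_start=$(( part_end + 1 ))
done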
00:05:24.005 10:00:45 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:24.005 10:00:45 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:24.005 10:00:45 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:24.005 10:00:45 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:24.005 10:00:45 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:24.948 The operation has completed successfully. 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 894481 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:65:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:24.948 
10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:24.948 10:00:46 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, 
so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:29.154 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:65:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:29.155 10:00:50 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.454 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # 
read -r pci _ _ status 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:32.715 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:32.715 00:05:32.715 real 0m11.113s 00:05:32.715 user 0m2.883s 00:05:32.715 sys 0m5.306s 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:32.715 10:00:54 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:32.715 ************************************ 00:05:32.715 END TEST dm_mount 00:05:32.715 ************************************ 00:05:32.715 10:00:54 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:32.715 10:00:54 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:32.715 10:00:54 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:32.715 10:00:54 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:32.715 10:00:54 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:32.715 10:00:54 setup.sh.devices -- setup/devices.sh@27 
-- # [[ -b /dev/nvme0n1 ]] 00:05:32.715 10:00:54 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:32.977 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:32.977 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:05:32.977 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:32.977 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:32.977 10:00:54 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:32.977 10:00:54 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:32.977 10:00:54 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:32.977 10:00:54 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:32.977 10:00:54 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:32.977 10:00:54 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:32.977 10:00:54 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:32.977 00:05:32.977 real 0m30.244s 00:05:32.977 user 0m8.972s 00:05:32.977 sys 0m16.093s 00:05:32.977 10:00:54 setup.sh.devices -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:32.977 10:00:54 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:32.977 ************************************ 00:05:32.977 END TEST devices 00:05:32.977 ************************************ 00:05:33.239 00:05:33.239 real 1m46.750s 00:05:33.239 user 0m34.240s 00:05:33.239 sys 1m0.253s 00:05:33.239 10:00:54 setup.sh -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:33.239 10:00:54 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:33.239 ************************************ 00:05:33.239 END TEST setup.sh 00:05:33.239 ************************************ 00:05:33.239 10:00:54 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:37.448 Hugepages 00:05:37.448 node hugesize free / total 00:05:37.448 node0 1048576kB 0 / 0 00:05:37.448 node0 2048kB 1024 / 1024 00:05:37.448 node1 1048576kB 0 / 0 00:05:37.448 node1 2048kB 1024 / 1024 00:05:37.448 00:05:37.448 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:37.448 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - - 00:05:37.448 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - - 00:05:37.448 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - - 00:05:37.448 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - - 00:05:37.448 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - - 00:05:37.448 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - - 00:05:37.448 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - - 00:05:37.448 I/OAT 0000:00:01.7 8086 0b00 0 ioatdma - - 00:05:37.448 NVMe 0000:65:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:05:37.448 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - - 00:05:37.448 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - - 00:05:37.448 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - - 00:05:37.448 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - - 00:05:37.448 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - - 00:05:37.448 I/OAT 0000:80:01.5 8086 0b00 1 ioatdma - - 00:05:37.448 I/OAT 0000:80:01.6 8086 0b00 1 ioatdma - - 00:05:37.448 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - - 00:05:37.448 10:00:58 -- spdk/autotest.sh@130 -- # uname -s 00:05:37.448 10:00:58 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:05:37.448 10:00:58 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:37.448 
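The Hugepages block that setup.sh status prints above (1024 free / 1024 total 2048kB pages on each of the two nodes) can be reproduced from the standard sysfs counters alone; a small sketch of that query, not the script's own code:

for node in /sys/devices/system/node/node[0-9]*; do
  for hp in "$node"/hugepages/hugepages-*; do
    size_kb=${hp##*hugepages-}                   # e.g. 2048kB or 1048576kB
    free=$(cat "$hp/free_hugepages")
    total=$(cat "$hp/nr_hugepages")
    echo "$(basename "$node") $size_kb $free / $total"
  done
done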
10:00:58 -- common/autotest_common.sh@1530 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:41.659 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:41.659 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:43.692 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:05:43.692 10:01:05 -- common/autotest_common.sh@1531 -- # sleep 1 00:05:44.264 10:01:06 -- common/autotest_common.sh@1532 -- # bdfs=() 00:05:44.264 10:01:06 -- common/autotest_common.sh@1532 -- # local bdfs 00:05:44.264 10:01:06 -- common/autotest_common.sh@1533 -- # bdfs=($(get_nvme_bdfs)) 00:05:44.264 10:01:06 -- common/autotest_common.sh@1533 -- # get_nvme_bdfs 00:05:44.264 10:01:06 -- common/autotest_common.sh@1512 -- # bdfs=() 00:05:44.264 10:01:06 -- common/autotest_common.sh@1512 -- # local bdfs 00:05:44.264 10:01:06 -- common/autotest_common.sh@1513 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:44.264 10:01:06 -- common/autotest_common.sh@1513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:44.264 10:01:06 -- common/autotest_common.sh@1513 -- # jq -r '.config[].params.traddr' 00:05:44.525 10:01:06 -- common/autotest_common.sh@1514 -- # (( 1 == 0 )) 00:05:44.525 10:01:06 -- common/autotest_common.sh@1518 -- # printf '%s\n' 0000:65:00.0 00:05:44.525 10:01:06 -- common/autotest_common.sh@1535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:48.733 Waiting for block devices as requested 00:05:48.733 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma 00:05:48.733 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma 00:05:48.733 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma 00:05:48.733 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma 00:05:48.733 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma 00:05:48.733 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma 00:05:48.733 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma 00:05:48.733 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma 00:05:48.994 0000:65:00.0 (8086 0a54): vfio-pci -> nvme 00:05:48.994 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma 00:05:48.994 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma 00:05:49.255 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma 00:05:49.255 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma 00:05:49.255 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma 00:05:49.515 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma 00:05:49.515 0000:00:01.0 (8086 0b00): vfio-pci -> ioatdma 00:05:49.515 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma 00:05:49.515 10:01:11 -- common/autotest_common.sh@1537 -- # for bdf in "${bdfs[@]}" 00:05:49.515 10:01:11 -- common/autotest_common.sh@1538 -- # get_nvme_ctrlr_from_bdf 0000:65:00.0 00:05:49.515 10:01:11 -- 
common/autotest_common.sh@1501 -- # readlink -f /sys/class/nvme/nvme0 00:05:49.515 10:01:11 -- common/autotest_common.sh@1501 -- # grep 0000:65:00.0/nvme/nvme 00:05:49.515 10:01:11 -- common/autotest_common.sh@1501 -- # bdf_sysfs_path=/sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 00:05:49.515 10:01:11 -- common/autotest_common.sh@1502 -- # [[ -z /sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 ]] 00:05:49.776 10:01:11 -- common/autotest_common.sh@1506 -- # basename /sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 00:05:49.776 10:01:11 -- common/autotest_common.sh@1506 -- # printf '%s\n' nvme0 00:05:49.776 10:01:11 -- common/autotest_common.sh@1538 -- # nvme_ctrlr=/dev/nvme0 00:05:49.776 10:01:11 -- common/autotest_common.sh@1539 -- # [[ -z /dev/nvme0 ]] 00:05:49.776 10:01:11 -- common/autotest_common.sh@1544 -- # nvme id-ctrl /dev/nvme0 00:05:49.776 10:01:11 -- common/autotest_common.sh@1544 -- # grep oacs 00:05:49.776 10:01:11 -- common/autotest_common.sh@1544 -- # cut -d: -f2 00:05:49.776 10:01:11 -- common/autotest_common.sh@1544 -- # oacs=' 0xe' 00:05:49.776 10:01:11 -- common/autotest_common.sh@1545 -- # oacs_ns_manage=8 00:05:49.776 10:01:11 -- common/autotest_common.sh@1547 -- # [[ 8 -ne 0 ]] 00:05:49.776 10:01:11 -- common/autotest_common.sh@1553 -- # nvme id-ctrl /dev/nvme0 00:05:49.776 10:01:11 -- common/autotest_common.sh@1553 -- # grep unvmcap 00:05:49.776 10:01:11 -- common/autotest_common.sh@1553 -- # cut -d: -f2 00:05:49.776 10:01:11 -- common/autotest_common.sh@1553 -- # unvmcap=' 0' 00:05:49.776 10:01:11 -- common/autotest_common.sh@1554 -- # [[ 0 -eq 0 ]] 00:05:49.776 10:01:11 -- common/autotest_common.sh@1556 -- # continue 00:05:49.776 10:01:11 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:49.776 10:01:11 -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:49.776 10:01:11 -- common/autotest_common.sh@10 -- # set +x 00:05:49.776 10:01:11 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:49.776 10:01:11 -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:49.776 10:01:11 -- common/autotest_common.sh@10 -- # set +x 00:05:49.776 10:01:11 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:53.979 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:53.979 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:55.890 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:05:55.890 10:01:17 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:55.890 10:01:17 -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:55.890 10:01:17 -- common/autotest_common.sh@10 -- # set +x 00:05:55.890 10:01:17 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:55.890 
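The id-ctrl probing logged above boils down to reading the OACS field and testing its namespace-management bit (0xe & 0x8 = 8, hence oacs_ns_manage=8) before looking at unvmcap; a condensed sketch using the same nvme-cli calls, with the messages here being illustrative rather than the script's output:

ctrlr=/dev/nvme0
oacs=$(nvme id-ctrl "$ctrlr" | grep oacs | cut -d: -f2)        # ' 0xe' in this run
if (( (oacs & 0x8) != 0 )); then                               # bit 3 = namespace management supported
  unvmcap=$(nvme id-ctrl "$ctrlr" | grep unvmcap | cut -d: -f2)
  (( unvmcap == 0 )) && echo "$ctrlr: no unallocated capacity, skip namespace revert"
fi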
10:01:17 -- common/autotest_common.sh@1590 -- # mapfile -t bdfs 00:05:55.890 10:01:17 -- common/autotest_common.sh@1590 -- # get_nvme_bdfs_by_id 0x0a54 00:05:55.890 10:01:17 -- common/autotest_common.sh@1576 -- # bdfs=() 00:05:55.890 10:01:17 -- common/autotest_common.sh@1576 -- # local bdfs 00:05:55.890 10:01:17 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs 00:05:55.890 10:01:17 -- common/autotest_common.sh@1512 -- # bdfs=() 00:05:55.890 10:01:17 -- common/autotest_common.sh@1512 -- # local bdfs 00:05:55.890 10:01:17 -- common/autotest_common.sh@1513 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:55.890 10:01:17 -- common/autotest_common.sh@1513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:55.890 10:01:17 -- common/autotest_common.sh@1513 -- # jq -r '.config[].params.traddr' 00:05:55.890 10:01:17 -- common/autotest_common.sh@1514 -- # (( 1 == 0 )) 00:05:55.890 10:01:17 -- common/autotest_common.sh@1518 -- # printf '%s\n' 0000:65:00.0 00:05:55.890 10:01:17 -- common/autotest_common.sh@1578 -- # for bdf in $(get_nvme_bdfs) 00:05:55.890 10:01:17 -- common/autotest_common.sh@1579 -- # cat /sys/bus/pci/devices/0000:65:00.0/device 00:05:55.890 10:01:17 -- common/autotest_common.sh@1579 -- # device=0x0a54 00:05:55.890 10:01:17 -- common/autotest_common.sh@1580 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:55.890 10:01:17 -- common/autotest_common.sh@1581 -- # bdfs+=($bdf) 00:05:55.890 10:01:17 -- common/autotest_common.sh@1585 -- # printf '%s\n' 0000:65:00.0 00:05:55.890 10:01:17 -- common/autotest_common.sh@1591 -- # [[ -z 0000:65:00.0 ]] 00:05:55.890 10:01:17 -- common/autotest_common.sh@1596 -- # spdk_tgt_pid=905698 00:05:55.890 10:01:17 -- common/autotest_common.sh@1597 -- # waitforlisten 905698 00:05:55.890 10:01:17 -- common/autotest_common.sh@1595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:55.890 10:01:17 -- common/autotest_common.sh@830 -- # '[' -z 905698 ']' 00:05:55.890 10:01:17 -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:55.891 10:01:17 -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:55.891 10:01:17 -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:55.891 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:55.891 10:01:17 -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:55.891 10:01:17 -- common/autotest_common.sh@10 -- # set +x 00:05:55.891 [2024-06-10 10:01:17.688765] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
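The get_nvme_bdfs_by_id trace above filters the detected NVMe addresses by PCI device id (0x0a54 here) via the sysfs device attribute; a standalone sketch that reuses the gen_nvme.sh | jq pipeline visible in the log, assuming rootdir points at the same SPDK checkout:

rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk        # path taken from the log above
want=0x0a54                                                    # device id compared at autotest_common.sh@1580
mapfile -t bdfs < <("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')
matched=()
for bdf in "${bdfs[@]}"; do
  [[ $(cat "/sys/bus/pci/devices/$bdf/device") == "$want" ]] && matched+=("$bdf")
done
printf '%s\n' "${matched[@]}"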
00:05:55.891 [2024-06-10 10:01:17.688836] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid905698 ] 00:05:56.151 [2024-06-10 10:01:17.772357] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:56.151 [2024-06-10 10:01:17.867032] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.722 10:01:18 -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:56.722 10:01:18 -- common/autotest_common.sh@863 -- # return 0 00:05:56.722 10:01:18 -- common/autotest_common.sh@1599 -- # bdf_id=0 00:05:56.722 10:01:18 -- common/autotest_common.sh@1600 -- # for bdf in "${bdfs[@]}" 00:05:56.722 10:01:18 -- common/autotest_common.sh@1601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:65:00.0 00:06:00.016 nvme0n1 00:06:00.016 10:01:21 -- common/autotest_common.sh@1603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:00.016 [2024-06-10 10:01:21.747187] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:00.016 request: 00:06:00.016 { 00:06:00.016 "nvme_ctrlr_name": "nvme0", 00:06:00.016 "password": "test", 00:06:00.016 "method": "bdev_nvme_opal_revert", 00:06:00.016 "req_id": 1 00:06:00.016 } 00:06:00.016 Got JSON-RPC error response 00:06:00.016 response: 00:06:00.016 { 00:06:00.016 "code": -32602, 00:06:00.016 "message": "Invalid parameters" 00:06:00.016 } 00:06:00.016 10:01:21 -- common/autotest_common.sh@1603 -- # true 00:06:00.016 10:01:21 -- common/autotest_common.sh@1604 -- # (( ++bdf_id )) 00:06:00.016 10:01:21 -- common/autotest_common.sh@1607 -- # killprocess 905698 00:06:00.016 10:01:21 -- common/autotest_common.sh@949 -- # '[' -z 905698 ']' 00:06:00.016 10:01:21 -- common/autotest_common.sh@953 -- # kill -0 905698 00:06:00.016 10:01:21 -- common/autotest_common.sh@954 -- # uname 00:06:00.016 10:01:21 -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:00.016 10:01:21 -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 905698 00:06:00.016 10:01:21 -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:06:00.016 10:01:21 -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:06:00.016 10:01:21 -- common/autotest_common.sh@967 -- # echo 'killing process with pid 905698' 00:06:00.016 killing process with pid 905698 00:06:00.016 10:01:21 -- common/autotest_common.sh@968 -- # kill 905698 00:06:00.016 10:01:21 -- common/autotest_common.sh@973 -- # wait 905698 00:06:02.557 10:01:24 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:06:02.557 10:01:24 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:06:02.557 10:01:24 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:06:02.558 10:01:24 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:06:02.558 10:01:24 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:06:03.128 Restarting all devices. 
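The opal_revert_cleanup step that just finished is essentially two RPC calls against the freshly started spdk_tgt, with the "not support opal" error above treated as a benign outcome; condensed, with rpc.py invoked from the same checkout:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$rpc bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:65:00.0
if ! $rpc bdev_nvme_opal_revert -b nvme0 -p test; then
  echo "controller reports no OPAL support, nothing to revert"   # matches the -32602 response above
fi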
00:06:07.333 lstat() error: No such file or directory 00:06:07.333 QAT Error: No GENERAL section found 00:06:07.333 Failed to configure qat_dev0 00:06:07.333 lstat() error: No such file or directory 00:06:07.333 QAT Error: No GENERAL section found 00:06:07.333 Failed to configure qat_dev1 00:06:07.333 lstat() error: No such file or directory 00:06:07.333 QAT Error: No GENERAL section found 00:06:07.333 Failed to configure qat_dev2 00:06:07.333 enable sriov 00:06:07.333 Checking status of all devices. 00:06:07.333 There is 3 QAT acceleration device(s) in the system: 00:06:07.333 qat_dev0 - type: c6xx, inst_id: 0, node_id: 1, bsf: 0000:cc:00.0, #accel: 5 #engines: 10 state: down 00:06:07.333 qat_dev1 - type: c6xx, inst_id: 1, node_id: 1, bsf: 0000:ce:00.0, #accel: 5 #engines: 10 state: down 00:06:07.333 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:d0:00.0, #accel: 5 #engines: 10 state: down 00:06:07.333 0000:cc:00.0 set to 16 VFs 00:06:07.594 0000:ce:00.0 set to 16 VFs 00:06:08.167 0000:d0:00.0 set to 16 VFs 00:06:08.428 Properly configured the qat device with driver uio_pci_generic. 00:06:08.428 10:01:30 -- spdk/autotest.sh@162 -- # timing_enter lib 00:06:08.428 10:01:30 -- common/autotest_common.sh@723 -- # xtrace_disable 00:06:08.428 10:01:30 -- common/autotest_common.sh@10 -- # set +x 00:06:08.428 10:01:30 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:06:08.428 10:01:30 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:08.428 10:01:30 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:08.428 10:01:30 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:08.428 10:01:30 -- common/autotest_common.sh@10 -- # set +x 00:06:08.428 ************************************ 00:06:08.428 START TEST env 00:06:08.428 ************************************ 00:06:08.428 10:01:30 env -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:08.428 * Looking for test storage... 
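The "set to 16 VFs" lines above correspond to enabling SR-IOV on each QAT physical function; at the sysfs level that is a write to sriov_numvfs. Shown here as the generic kernel interface, not necessarily the exact code path qat_setup.sh takes:

for bdf in 0000:cc:00.0 0000:ce:00.0 0000:d0:00.0; do   # the three c6xx endpoints listed above
  echo 0  > "/sys/bus/pci/devices/$bdf/sriov_numvfs"    # the VF count can only be changed via 0
  echo 16 > "/sys/bus/pci/devices/$bdf/sriov_numvfs"
done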
00:06:08.428 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:06:08.428 10:01:30 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:08.428 10:01:30 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:08.428 10:01:30 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:08.428 10:01:30 env -- common/autotest_common.sh@10 -- # set +x 00:06:08.428 ************************************ 00:06:08.428 START TEST env_memory 00:06:08.428 ************************************ 00:06:08.428 10:01:30 env.env_memory -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:08.428 00:06:08.428 00:06:08.428 CUnit - A unit testing framework for C - Version 2.1-3 00:06:08.428 http://cunit.sourceforge.net/ 00:06:08.428 00:06:08.428 00:06:08.428 Suite: memory 00:06:08.690 Test: alloc and free memory map ...[2024-06-10 10:01:30.313978] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:08.690 passed 00:06:08.690 Test: mem map translation ...[2024-06-10 10:01:30.337745] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:08.690 [2024-06-10 10:01:30.337771] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:08.690 [2024-06-10 10:01:30.337817] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:08.690 [2024-06-10 10:01:30.337828] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:08.690 passed 00:06:08.690 Test: mem map registration ...[2024-06-10 10:01:30.388868] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:06:08.690 [2024-06-10 10:01:30.388888] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:06:08.690 passed 00:06:08.690 Test: mem map adjacent registrations ...passed 00:06:08.690 00:06:08.690 Run Summary: Type Total Ran Passed Failed Inactive 00:06:08.690 suites 1 1 n/a 0 0 00:06:08.690 tests 4 4 4 0 0 00:06:08.690 asserts 152 152 152 0 n/a 00:06:08.690 00:06:08.690 Elapsed time = 0.181 seconds 00:06:08.690 00:06:08.690 real 0m0.195s 00:06:08.690 user 0m0.186s 00:06:08.690 sys 0m0.008s 00:06:08.690 10:01:30 env.env_memory -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:08.690 10:01:30 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:08.690 ************************************ 00:06:08.690 END TEST env_memory 00:06:08.690 ************************************ 00:06:08.690 10:01:30 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:08.690 10:01:30 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:08.690 10:01:30 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:08.690 10:01:30 env 
-- common/autotest_common.sh@10 -- # set +x 00:06:08.690 ************************************ 00:06:08.690 START TEST env_vtophys 00:06:08.690 ************************************ 00:06:08.690 10:01:30 env.env_vtophys -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:08.952 EAL: lib.eal log level changed from notice to debug 00:06:08.952 EAL: Detected lcore 0 as core 0 on socket 0 00:06:08.952 EAL: Detected lcore 1 as core 1 on socket 0 00:06:08.952 EAL: Detected lcore 2 as core 2 on socket 0 00:06:08.952 EAL: Detected lcore 3 as core 3 on socket 0 00:06:08.952 EAL: Detected lcore 4 as core 4 on socket 0 00:06:08.952 EAL: Detected lcore 5 as core 5 on socket 0 00:06:08.952 EAL: Detected lcore 6 as core 6 on socket 0 00:06:08.952 EAL: Detected lcore 7 as core 7 on socket 0 00:06:08.952 EAL: Detected lcore 8 as core 8 on socket 0 00:06:08.952 EAL: Detected lcore 9 as core 9 on socket 0 00:06:08.952 EAL: Detected lcore 10 as core 10 on socket 0 00:06:08.952 EAL: Detected lcore 11 as core 11 on socket 0 00:06:08.952 EAL: Detected lcore 12 as core 12 on socket 0 00:06:08.952 EAL: Detected lcore 13 as core 13 on socket 0 00:06:08.952 EAL: Detected lcore 14 as core 14 on socket 0 00:06:08.952 EAL: Detected lcore 15 as core 15 on socket 0 00:06:08.952 EAL: Detected lcore 16 as core 16 on socket 0 00:06:08.952 EAL: Detected lcore 17 as core 17 on socket 0 00:06:08.952 EAL: Detected lcore 18 as core 18 on socket 0 00:06:08.952 EAL: Detected lcore 19 as core 19 on socket 0 00:06:08.952 EAL: Detected lcore 20 as core 20 on socket 0 00:06:08.952 EAL: Detected lcore 21 as core 21 on socket 0 00:06:08.952 EAL: Detected lcore 22 as core 22 on socket 0 00:06:08.952 EAL: Detected lcore 23 as core 23 on socket 0 00:06:08.952 EAL: Detected lcore 24 as core 24 on socket 0 00:06:08.952 EAL: Detected lcore 25 as core 25 on socket 0 00:06:08.952 EAL: Detected lcore 26 as core 26 on socket 0 00:06:08.952 EAL: Detected lcore 27 as core 27 on socket 0 00:06:08.952 EAL: Detected lcore 28 as core 28 on socket 0 00:06:08.952 EAL: Detected lcore 29 as core 29 on socket 0 00:06:08.952 EAL: Detected lcore 30 as core 30 on socket 0 00:06:08.952 EAL: Detected lcore 31 as core 31 on socket 0 00:06:08.952 EAL: Detected lcore 32 as core 0 on socket 1 00:06:08.952 EAL: Detected lcore 33 as core 1 on socket 1 00:06:08.952 EAL: Detected lcore 34 as core 2 on socket 1 00:06:08.952 EAL: Detected lcore 35 as core 3 on socket 1 00:06:08.952 EAL: Detected lcore 36 as core 4 on socket 1 00:06:08.952 EAL: Detected lcore 37 as core 5 on socket 1 00:06:08.952 EAL: Detected lcore 38 as core 6 on socket 1 00:06:08.952 EAL: Detected lcore 39 as core 7 on socket 1 00:06:08.952 EAL: Detected lcore 40 as core 8 on socket 1 00:06:08.952 EAL: Detected lcore 41 as core 9 on socket 1 00:06:08.952 EAL: Detected lcore 42 as core 10 on socket 1 00:06:08.952 EAL: Detected lcore 43 as core 11 on socket 1 00:06:08.952 EAL: Detected lcore 44 as core 12 on socket 1 00:06:08.952 EAL: Detected lcore 45 as core 13 on socket 1 00:06:08.952 EAL: Detected lcore 46 as core 14 on socket 1 00:06:08.952 EAL: Detected lcore 47 as core 15 on socket 1 00:06:08.952 EAL: Detected lcore 48 as core 16 on socket 1 00:06:08.952 EAL: Detected lcore 49 as core 17 on socket 1 00:06:08.952 EAL: Detected lcore 50 as core 18 on socket 1 00:06:08.952 EAL: Detected lcore 51 as core 19 on socket 1 00:06:08.952 EAL: Detected lcore 52 as core 20 on socket 1 00:06:08.952 EAL: Detected lcore 53 as core 21 on socket 1 
00:06:08.952 EAL: Detected lcore 54 as core 22 on socket 1 00:06:08.952 EAL: Detected lcore 55 as core 23 on socket 1 00:06:08.952 EAL: Detected lcore 56 as core 24 on socket 1 00:06:08.952 EAL: Detected lcore 57 as core 25 on socket 1 00:06:08.952 EAL: Detected lcore 58 as core 26 on socket 1 00:06:08.952 EAL: Detected lcore 59 as core 27 on socket 1 00:06:08.952 EAL: Detected lcore 60 as core 28 on socket 1 00:06:08.952 EAL: Detected lcore 61 as core 29 on socket 1 00:06:08.952 EAL: Detected lcore 62 as core 30 on socket 1 00:06:08.952 EAL: Detected lcore 63 as core 31 on socket 1 00:06:08.952 EAL: Detected lcore 64 as core 0 on socket 0 00:06:08.952 EAL: Detected lcore 65 as core 1 on socket 0 00:06:08.952 EAL: Detected lcore 66 as core 2 on socket 0 00:06:08.952 EAL: Detected lcore 67 as core 3 on socket 0 00:06:08.952 EAL: Detected lcore 68 as core 4 on socket 0 00:06:08.952 EAL: Detected lcore 69 as core 5 on socket 0 00:06:08.952 EAL: Detected lcore 70 as core 6 on socket 0 00:06:08.952 EAL: Detected lcore 71 as core 7 on socket 0 00:06:08.952 EAL: Detected lcore 72 as core 8 on socket 0 00:06:08.952 EAL: Detected lcore 73 as core 9 on socket 0 00:06:08.952 EAL: Detected lcore 74 as core 10 on socket 0 00:06:08.952 EAL: Detected lcore 75 as core 11 on socket 0 00:06:08.952 EAL: Detected lcore 76 as core 12 on socket 0 00:06:08.952 EAL: Detected lcore 77 as core 13 on socket 0 00:06:08.952 EAL: Detected lcore 78 as core 14 on socket 0 00:06:08.952 EAL: Detected lcore 79 as core 15 on socket 0 00:06:08.952 EAL: Detected lcore 80 as core 16 on socket 0 00:06:08.952 EAL: Detected lcore 81 as core 17 on socket 0 00:06:08.952 EAL: Detected lcore 82 as core 18 on socket 0 00:06:08.952 EAL: Detected lcore 83 as core 19 on socket 0 00:06:08.952 EAL: Detected lcore 84 as core 20 on socket 0 00:06:08.952 EAL: Detected lcore 85 as core 21 on socket 0 00:06:08.952 EAL: Detected lcore 86 as core 22 on socket 0 00:06:08.952 EAL: Detected lcore 87 as core 23 on socket 0 00:06:08.952 EAL: Detected lcore 88 as core 24 on socket 0 00:06:08.952 EAL: Detected lcore 89 as core 25 on socket 0 00:06:08.952 EAL: Detected lcore 90 as core 26 on socket 0 00:06:08.952 EAL: Detected lcore 91 as core 27 on socket 0 00:06:08.952 EAL: Detected lcore 92 as core 28 on socket 0 00:06:08.952 EAL: Detected lcore 93 as core 29 on socket 0 00:06:08.952 EAL: Detected lcore 94 as core 30 on socket 0 00:06:08.952 EAL: Detected lcore 95 as core 31 on socket 0 00:06:08.952 EAL: Detected lcore 96 as core 0 on socket 1 00:06:08.952 EAL: Detected lcore 97 as core 1 on socket 1 00:06:08.952 EAL: Detected lcore 98 as core 2 on socket 1 00:06:08.952 EAL: Detected lcore 99 as core 3 on socket 1 00:06:08.952 EAL: Detected lcore 100 as core 4 on socket 1 00:06:08.952 EAL: Detected lcore 101 as core 5 on socket 1 00:06:08.952 EAL: Detected lcore 102 as core 6 on socket 1 00:06:08.952 EAL: Detected lcore 103 as core 7 on socket 1 00:06:08.952 EAL: Detected lcore 104 as core 8 on socket 1 00:06:08.952 EAL: Detected lcore 105 as core 9 on socket 1 00:06:08.952 EAL: Detected lcore 106 as core 10 on socket 1 00:06:08.953 EAL: Detected lcore 107 as core 11 on socket 1 00:06:08.953 EAL: Detected lcore 108 as core 12 on socket 1 00:06:08.953 EAL: Detected lcore 109 as core 13 on socket 1 00:06:08.953 EAL: Detected lcore 110 as core 14 on socket 1 00:06:08.953 EAL: Detected lcore 111 as core 15 on socket 1 00:06:08.953 EAL: Detected lcore 112 as core 16 on socket 1 00:06:08.953 EAL: Detected lcore 113 as core 17 on socket 1 00:06:08.953 EAL: 
Detected lcore 114 as core 18 on socket 1 00:06:08.953 EAL: Detected lcore 115 as core 19 on socket 1 00:06:08.953 EAL: Detected lcore 116 as core 20 on socket 1 00:06:08.953 EAL: Detected lcore 117 as core 21 on socket 1 00:06:08.953 EAL: Detected lcore 118 as core 22 on socket 1 00:06:08.953 EAL: Detected lcore 119 as core 23 on socket 1 00:06:08.953 EAL: Detected lcore 120 as core 24 on socket 1 00:06:08.953 EAL: Detected lcore 121 as core 25 on socket 1 00:06:08.953 EAL: Detected lcore 122 as core 26 on socket 1 00:06:08.953 EAL: Detected lcore 123 as core 27 on socket 1 00:06:08.953 EAL: Detected lcore 124 as core 28 on socket 1 00:06:08.953 EAL: Detected lcore 125 as core 29 on socket 1 00:06:08.953 EAL: Detected lcore 126 as core 30 on socket 1 00:06:08.953 EAL: Detected lcore 127 as core 31 on socket 1 00:06:08.953 EAL: Maximum logical cores by configuration: 128 00:06:08.953 EAL: Detected CPU lcores: 128 00:06:08.953 EAL: Detected NUMA nodes: 2 00:06:08.953 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:06:08.953 EAL: Detected shared linkage of DPDK 00:06:08.953 EAL: No shared files mode enabled, IPC will be disabled 00:06:08.953 EAL: No shared files mode enabled, IPC is disabled 00:06:08.953 EAL: PCI driver qat for device 0000:cc:01.0 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:01.1 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:01.2 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:01.3 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:01.4 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:01.5 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:01.6 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:01.7 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:02.0 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:02.1 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:02.2 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:02.3 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:02.4 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:02.5 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:02.6 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:cc:02.7 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:01.0 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:01.1 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:01.2 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:01.3 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:01.4 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:01.5 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:01.6 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:01.7 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:02.0 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:02.1 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:02.2 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:02.3 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:02.4 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:02.5 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:02.6 
wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:ce:02.7 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:01.0 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:01.1 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:01.2 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:01.3 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:01.4 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:01.5 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:01.6 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:01.7 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:02.0 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:02.1 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:02.2 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:02.3 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:02.4 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:02.5 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:02.6 wants IOVA as 'PA' 00:06:08.953 EAL: PCI driver qat for device 0000:d0:02.7 wants IOVA as 'PA' 00:06:08.953 EAL: Bus pci wants IOVA as 'PA' 00:06:08.953 EAL: Bus auxiliary wants IOVA as 'DC' 00:06:08.953 EAL: Bus vdev wants IOVA as 'DC' 00:06:08.953 EAL: Selected IOVA mode 'PA' 00:06:08.953 EAL: Probing VFIO support... 00:06:08.953 EAL: IOMMU type 1 (Type 1) is supported 00:06:08.953 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:08.953 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:08.953 EAL: VFIO support initialized 00:06:08.953 EAL: Ask a virtual area of 0x2e000 bytes 00:06:08.953 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:08.953 EAL: Setting up physically contiguous memory... 
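Each memseg list the EAL creates just below holds 8192 segments of 2 MiB hugepages, which is exactly why every reserved virtual area is reported as size = 0x400000000; a one-liner to confirm the arithmetic:

n_segs=8192
hugepage_sz=2097152                                    # 2 MiB pages, as detected above
printf 'VA reserved per memseg list: 0x%x bytes\n' $(( n_segs * hugepage_sz ))   # prints 0x400000000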
00:06:08.953 EAL: Setting maximum number of open files to 524288 00:06:08.953 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:08.953 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:08.953 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:08.953 EAL: Ask a virtual area of 0x61000 bytes 00:06:08.953 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:08.953 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:08.953 EAL: Ask a virtual area of 0x400000000 bytes 00:06:08.953 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:08.953 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:08.953 EAL: Ask a virtual area of 0x61000 bytes 00:06:08.953 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:08.953 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:08.953 EAL: Ask a virtual area of 0x400000000 bytes 00:06:08.953 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:08.953 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:08.953 EAL: Ask a virtual area of 0x61000 bytes 00:06:08.953 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:08.953 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:08.953 EAL: Ask a virtual area of 0x400000000 bytes 00:06:08.953 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:08.953 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:08.953 EAL: Ask a virtual area of 0x61000 bytes 00:06:08.953 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:08.953 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:08.953 EAL: Ask a virtual area of 0x400000000 bytes 00:06:08.953 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:08.953 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:08.953 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:08.953 EAL: Ask a virtual area of 0x61000 bytes 00:06:08.953 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:08.953 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:08.953 EAL: Ask a virtual area of 0x400000000 bytes 00:06:08.953 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:08.953 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:08.953 EAL: Ask a virtual area of 0x61000 bytes 00:06:08.953 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:08.953 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:08.953 EAL: Ask a virtual area of 0x400000000 bytes 00:06:08.953 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:08.953 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:08.953 EAL: Ask a virtual area of 0x61000 bytes 00:06:08.953 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:08.953 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:08.953 EAL: Ask a virtual area of 0x400000000 bytes 00:06:08.953 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:08.953 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:08.953 EAL: Ask a virtual area of 0x61000 bytes 00:06:08.953 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:08.953 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:08.953 EAL: Ask a virtual area of 0x400000000 bytes 00:06:08.953 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:06:08.953 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:08.953 EAL: Hugepages will be freed exactly as allocated. 00:06:08.953 EAL: No shared files mode enabled, IPC is disabled 00:06:08.953 EAL: No shared files mode enabled, IPC is disabled 00:06:08.953 EAL: TSC frequency is ~2600000 KHz 00:06:08.953 EAL: Main lcore 0 is ready (tid=7f58ae6eab00;cpuset=[0]) 00:06:08.953 EAL: Trying to obtain current memory policy. 00:06:08.953 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.953 EAL: Restoring previous memory policy: 0 00:06:08.953 EAL: request: mp_malloc_sync 00:06:08.953 EAL: No shared files mode enabled, IPC is disabled 00:06:08.953 EAL: Heap on socket 0 was expanded by 2MB 00:06:08.953 EAL: PCI device 0000:cc:01.0 on NUMA socket 1 00:06:08.953 EAL: probe driver: 8086:37c9 qat 00:06:08.953 EAL: PCI memory mapped at 0x202001000000 00:06:08.953 EAL: PCI memory mapped at 0x202001001000 00:06:08.953 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:06:08.953 EAL: Trying to obtain current memory policy. 00:06:08.953 EAL: Setting policy MPOL_PREFERRED for socket 1 00:06:08.953 EAL: Restoring previous memory policy: 4 00:06:08.953 EAL: request: mp_malloc_sync 00:06:08.953 EAL: No shared files mode enabled, IPC is disabled 00:06:08.953 EAL: Heap on socket 1 was expanded by 2MB 00:06:08.953 EAL: PCI device 0000:cc:01.1 on NUMA socket 1 00:06:08.953 EAL: probe driver: 8086:37c9 qat 00:06:08.953 EAL: PCI memory mapped at 0x202001002000 00:06:08.953 EAL: PCI memory mapped at 0x202001003000 00:06:08.953 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:06:08.953 EAL: PCI device 0000:cc:01.2 on NUMA socket 1 00:06:08.953 EAL: probe driver: 8086:37c9 qat 00:06:08.953 EAL: PCI memory mapped at 0x202001004000 00:06:08.954 EAL: PCI memory mapped at 0x202001005000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:01.3 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001006000 00:06:08.954 EAL: PCI memory mapped at 0x202001007000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:01.4 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001008000 00:06:08.954 EAL: PCI memory mapped at 0x202001009000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:01.5 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200100a000 00:06:08.954 EAL: PCI memory mapped at 0x20200100b000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:01.6 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200100c000 00:06:08.954 EAL: PCI memory mapped at 0x20200100d000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:01.7 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200100e000 00:06:08.954 EAL: PCI memory mapped at 0x20200100f000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:02.0 on NUMA socket 1 00:06:08.954 EAL: 
probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001010000 00:06:08.954 EAL: PCI memory mapped at 0x202001011000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:02.1 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001012000 00:06:08.954 EAL: PCI memory mapped at 0x202001013000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:02.2 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001014000 00:06:08.954 EAL: PCI memory mapped at 0x202001015000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:02.3 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001016000 00:06:08.954 EAL: PCI memory mapped at 0x202001017000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:02.4 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001018000 00:06:08.954 EAL: PCI memory mapped at 0x202001019000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:02.5 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200101a000 00:06:08.954 EAL: PCI memory mapped at 0x20200101b000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:02.6 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200101c000 00:06:08.954 EAL: PCI memory mapped at 0x20200101d000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:06:08.954 EAL: PCI device 0000:cc:02.7 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200101e000 00:06:08.954 EAL: PCI memory mapped at 0x20200101f000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:01.0 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001020000 00:06:08.954 EAL: PCI memory mapped at 0x202001021000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:01.1 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001022000 00:06:08.954 EAL: PCI memory mapped at 0x202001023000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:01.2 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001024000 00:06:08.954 EAL: PCI memory mapped at 0x202001025000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:01.3 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001026000 00:06:08.954 EAL: PCI memory mapped at 0x202001027000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:01.4 on NUMA socket 1 
00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001028000 00:06:08.954 EAL: PCI memory mapped at 0x202001029000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:01.5 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200102a000 00:06:08.954 EAL: PCI memory mapped at 0x20200102b000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:01.6 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200102c000 00:06:08.954 EAL: PCI memory mapped at 0x20200102d000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:01.7 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200102e000 00:06:08.954 EAL: PCI memory mapped at 0x20200102f000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:02.0 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001030000 00:06:08.954 EAL: PCI memory mapped at 0x202001031000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:02.1 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001032000 00:06:08.954 EAL: PCI memory mapped at 0x202001033000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:02.2 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001034000 00:06:08.954 EAL: PCI memory mapped at 0x202001035000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:02.3 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001036000 00:06:08.954 EAL: PCI memory mapped at 0x202001037000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:02.4 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001038000 00:06:08.954 EAL: PCI memory mapped at 0x202001039000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:02.5 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200103a000 00:06:08.954 EAL: PCI memory mapped at 0x20200103b000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:02.6 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200103c000 00:06:08.954 EAL: PCI memory mapped at 0x20200103d000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:06:08.954 EAL: PCI device 0000:ce:02.7 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200103e000 00:06:08.954 EAL: PCI memory mapped at 0x20200103f000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:06:08.954 EAL: PCI device 0000:d0:01.0 on NUMA 
socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001040000 00:06:08.954 EAL: PCI memory mapped at 0x202001041000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:06:08.954 EAL: PCI device 0000:d0:01.1 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001042000 00:06:08.954 EAL: PCI memory mapped at 0x202001043000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:06:08.954 EAL: PCI device 0000:d0:01.2 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001044000 00:06:08.954 EAL: PCI memory mapped at 0x202001045000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:06:08.954 EAL: PCI device 0000:d0:01.3 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001046000 00:06:08.954 EAL: PCI memory mapped at 0x202001047000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:06:08.954 EAL: PCI device 0000:d0:01.4 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x202001048000 00:06:08.954 EAL: PCI memory mapped at 0x202001049000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:06:08.954 EAL: PCI device 0000:d0:01.5 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200104a000 00:06:08.954 EAL: PCI memory mapped at 0x20200104b000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:06:08.954 EAL: PCI device 0000:d0:01.6 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.954 EAL: PCI memory mapped at 0x20200104c000 00:06:08.954 EAL: PCI memory mapped at 0x20200104d000 00:06:08.954 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:06:08.954 EAL: PCI device 0000:d0:01.7 on NUMA socket 1 00:06:08.954 EAL: probe driver: 8086:37c9 qat 00:06:08.955 EAL: PCI memory mapped at 0x20200104e000 00:06:08.955 EAL: PCI memory mapped at 0x20200104f000 00:06:08.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:06:08.955 EAL: PCI device 0000:d0:02.0 on NUMA socket 1 00:06:08.955 EAL: probe driver: 8086:37c9 qat 00:06:08.955 EAL: PCI memory mapped at 0x202001050000 00:06:08.955 EAL: PCI memory mapped at 0x202001051000 00:06:08.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:06:08.955 EAL: PCI device 0000:d0:02.1 on NUMA socket 1 00:06:08.955 EAL: probe driver: 8086:37c9 qat 00:06:08.955 EAL: PCI memory mapped at 0x202001052000 00:06:08.955 EAL: PCI memory mapped at 0x202001053000 00:06:08.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:06:08.955 EAL: PCI device 0000:d0:02.2 on NUMA socket 1 00:06:08.955 EAL: probe driver: 8086:37c9 qat 00:06:08.955 EAL: PCI memory mapped at 0x202001054000 00:06:08.955 EAL: PCI memory mapped at 0x202001055000 00:06:08.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:06:08.955 EAL: PCI device 0000:d0:02.3 on NUMA socket 1 00:06:08.955 EAL: probe driver: 8086:37c9 qat 00:06:08.955 EAL: PCI memory mapped at 0x202001056000 00:06:08.955 EAL: PCI memory mapped at 0x202001057000 00:06:08.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:06:08.955 EAL: PCI device 
0000:d0:02.4 on NUMA socket 1 00:06:08.955 EAL: probe driver: 8086:37c9 qat 00:06:08.955 EAL: PCI memory mapped at 0x202001058000 00:06:08.955 EAL: PCI memory mapped at 0x202001059000 00:06:08.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:06:08.955 EAL: PCI device 0000:d0:02.5 on NUMA socket 1 00:06:08.955 EAL: probe driver: 8086:37c9 qat 00:06:08.955 EAL: PCI memory mapped at 0x20200105a000 00:06:08.955 EAL: PCI memory mapped at 0x20200105b000 00:06:08.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:06:08.955 EAL: PCI device 0000:d0:02.6 on NUMA socket 1 00:06:08.955 EAL: probe driver: 8086:37c9 qat 00:06:08.955 EAL: PCI memory mapped at 0x20200105c000 00:06:08.955 EAL: PCI memory mapped at 0x20200105d000 00:06:08.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:06:08.955 EAL: PCI device 0000:d0:02.7 on NUMA socket 1 00:06:08.955 EAL: probe driver: 8086:37c9 qat 00:06:08.955 EAL: PCI memory mapped at 0x20200105e000 00:06:08.955 EAL: PCI memory mapped at 0x20200105f000 00:06:08.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: No PCI address specified using 'addr=' in: bus=pci 00:06:08.955 EAL: Mem event callback 'spdk:(nil)' registered 00:06:08.955 00:06:08.955 00:06:08.955 CUnit - A unit testing framework for C - Version 2.1-3 00:06:08.955 http://cunit.sourceforge.net/ 00:06:08.955 00:06:08.955 00:06:08.955 Suite: components_suite 00:06:08.955 Test: vtophys_malloc_test ...passed 00:06:08.955 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:08.955 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.955 EAL: Restoring previous memory policy: 4 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was expanded by 4MB 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was shrunk by 4MB 00:06:08.955 EAL: Trying to obtain current memory policy. 00:06:08.955 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.955 EAL: Restoring previous memory policy: 4 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was expanded by 6MB 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was shrunk by 6MB 00:06:08.955 EAL: Trying to obtain current memory policy. 
00:06:08.955 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.955 EAL: Restoring previous memory policy: 4 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was expanded by 10MB 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was shrunk by 10MB 00:06:08.955 EAL: Trying to obtain current memory policy. 00:06:08.955 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.955 EAL: Restoring previous memory policy: 4 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was expanded by 18MB 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was shrunk by 18MB 00:06:08.955 EAL: Trying to obtain current memory policy. 00:06:08.955 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.955 EAL: Restoring previous memory policy: 4 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was expanded by 34MB 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was shrunk by 34MB 00:06:08.955 EAL: Trying to obtain current memory policy. 00:06:08.955 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.955 EAL: Restoring previous memory policy: 4 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was expanded by 66MB 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was shrunk by 66MB 00:06:08.955 EAL: Trying to obtain current memory policy. 00:06:08.955 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.955 EAL: Restoring previous memory policy: 4 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was expanded by 130MB 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was shrunk by 130MB 00:06:08.955 EAL: Trying to obtain current memory policy. 
00:06:08.955 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.955 EAL: Restoring previous memory policy: 4 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.955 EAL: request: mp_malloc_sync 00:06:08.955 EAL: No shared files mode enabled, IPC is disabled 00:06:08.955 EAL: Heap on socket 0 was expanded by 258MB 00:06:08.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:09.216 EAL: request: mp_malloc_sync 00:06:09.216 EAL: No shared files mode enabled, IPC is disabled 00:06:09.216 EAL: Heap on socket 0 was shrunk by 258MB 00:06:09.216 EAL: Trying to obtain current memory policy. 00:06:09.216 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:09.216 EAL: Restoring previous memory policy: 4 00:06:09.216 EAL: Calling mem event callback 'spdk:(nil)' 00:06:09.216 EAL: request: mp_malloc_sync 00:06:09.216 EAL: No shared files mode enabled, IPC is disabled 00:06:09.216 EAL: Heap on socket 0 was expanded by 514MB 00:06:09.216 EAL: Calling mem event callback 'spdk:(nil)' 00:06:09.216 EAL: request: mp_malloc_sync 00:06:09.216 EAL: No shared files mode enabled, IPC is disabled 00:06:09.216 EAL: Heap on socket 0 was shrunk by 514MB 00:06:09.216 EAL: Trying to obtain current memory policy. 00:06:09.216 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:09.477 EAL: Restoring previous memory policy: 4 00:06:09.477 EAL: Calling mem event callback 'spdk:(nil)' 00:06:09.477 EAL: request: mp_malloc_sync 00:06:09.477 EAL: No shared files mode enabled, IPC is disabled 00:06:09.477 EAL: Heap on socket 0 was expanded by 1026MB 00:06:09.477 EAL: Calling mem event callback 'spdk:(nil)' 00:06:09.737 EAL: request: mp_malloc_sync 00:06:09.737 EAL: No shared files mode enabled, IPC is disabled 00:06:09.737 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:09.737 passed 00:06:09.737 00:06:09.737 Run Summary: Type Total Ran Passed Failed Inactive 00:06:09.737 suites 1 1 n/a 0 0 00:06:09.737 tests 2 2 2 0 0 00:06:09.737 asserts 6779 6779 6779 0 n/a 00:06:09.737 00:06:09.737 Elapsed time = 0.675 seconds 00:06:09.737 EAL: No shared files mode enabled, IPC is disabled 00:06:09.737 EAL: No shared files mode enabled, IPC is disabled 00:06:09.737 EAL: No shared files mode enabled, IPC is disabled 00:06:09.737 00:06:09.737 real 0m0.840s 00:06:09.737 user 0m0.433s 00:06:09.737 sys 0m0.371s 00:06:09.737 10:01:31 env.env_vtophys -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:09.737 10:01:31 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:09.737 ************************************ 00:06:09.737 END TEST env_vtophys 00:06:09.737 ************************************ 00:06:09.737 10:01:31 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:06:09.737 10:01:31 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:09.737 10:01:31 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:09.737 10:01:31 env -- common/autotest_common.sh@10 -- # set +x 00:06:09.737 ************************************ 00:06:09.737 START TEST env_pci 00:06:09.737 ************************************ 00:06:09.737 10:01:31 env.env_pci -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:06:09.737 00:06:09.737 00:06:09.737 CUnit - A unit testing framework for C - Version 2.1-3 00:06:09.737 http://cunit.sourceforge.net/ 00:06:09.737 00:06:09.737 00:06:09.737 Suite: pci 00:06:09.737 Test: pci_hook ...[2024-06-10 10:01:31.472017] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 908329 has claimed it 00:06:09.737 EAL: Cannot find device (10000:00:01.0) 00:06:09.737 EAL: Failed to attach device on primary process 00:06:09.737 passed 00:06:09.737 00:06:09.737 Run Summary: Type Total Ran Passed Failed Inactive 00:06:09.737 suites 1 1 n/a 0 0 00:06:09.738 tests 1 1 1 0 0 00:06:09.738 asserts 25 25 25 0 n/a 00:06:09.738 00:06:09.738 Elapsed time = 0.032 seconds 00:06:09.738 00:06:09.738 real 0m0.059s 00:06:09.738 user 0m0.027s 00:06:09.738 sys 0m0.032s 00:06:09.738 10:01:31 env.env_pci -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:09.738 10:01:31 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:09.738 ************************************ 00:06:09.738 END TEST env_pci 00:06:09.738 ************************************ 00:06:09.738 10:01:31 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:09.738 10:01:31 env -- env/env.sh@15 -- # uname 00:06:09.738 10:01:31 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:09.738 10:01:31 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:09.738 10:01:31 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:09.738 10:01:31 env -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:06:09.738 10:01:31 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:09.738 10:01:31 env -- common/autotest_common.sh@10 -- # set +x 00:06:09.738 ************************************ 00:06:09.738 START TEST env_dpdk_post_init 00:06:09.738 ************************************ 00:06:09.738 10:01:31 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:10.000 EAL: Detected CPU lcores: 128 00:06:10.000 EAL: Detected NUMA nodes: 2 00:06:10.000 EAL: Detected shared linkage of DPDK 00:06:10.000 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:10.000 EAL: Selected IOVA mode 'PA' 00:06:10.000 EAL: VFIO support initialized 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_sym 00:06:10.000 CRYPTODEV: 
Initialisation parameters - name: 0000:cc:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 
0000:cc:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_asym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.000 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_sym 00:06:10.000 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_sym,socket id: 1, max queue 
pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:ce:02.3 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 
00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 
0000:d0:02.3_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_asym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.001 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_sym 00:06:10.001 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.001 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:06:10.002 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_asym 00:06:10.002 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.002 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_sym 00:06:10.002 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.002 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:06:10.002 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_asym 00:06:10.002 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.002 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_sym 00:06:10.002 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.002 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:06:10.002 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_asym 00:06:10.002 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:10.002 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_sym 00:06:10.002 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:10.002 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:10.002 EAL: Using IOMMU type 1 (Type 1) 00:06:10.002 EAL: Ignore mapping IO port bar(1) 00:06:10.263 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.0 (socket 0) 00:06:10.263 EAL: Ignore mapping IO port bar(1) 00:06:10.523 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.1 (socket 0) 00:06:10.523 EAL: Ignore mapping IO port bar(1) 00:06:10.784 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.2 (socket 0) 00:06:10.784 EAL: Ignore mapping IO port bar(1) 00:06:10.784 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.3 (socket 0) 00:06:11.045 EAL: Ignore mapping IO port bar(1) 00:06:11.045 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.4 (socket 0) 00:06:11.306 EAL: Ignore mapping IO port bar(1) 00:06:11.306 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.5 (socket 0) 00:06:11.566 EAL: Ignore mapping IO port bar(1) 00:06:11.566 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.6 (socket 0) 00:06:11.566 EAL: Ignore mapping IO port bar(1) 00:06:11.831 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.7 (socket 0) 00:06:12.404 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:65:00.0 (socket 0) 00:06:12.665 EAL: Ignore mapping IO port bar(1) 00:06:12.665 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.0 (socket 1) 00:06:12.925 EAL: Ignore mapping IO port 
bar(1) 00:06:12.925 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.1 (socket 1) 00:06:13.186 EAL: Ignore mapping IO port bar(1) 00:06:13.186 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.2 (socket 1) 00:06:13.447 EAL: Ignore mapping IO port bar(1) 00:06:13.447 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.3 (socket 1) 00:06:13.447 EAL: Ignore mapping IO port bar(1) 00:06:13.708 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.4 (socket 1) 00:06:13.708 EAL: Ignore mapping IO port bar(1) 00:06:13.969 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.5 (socket 1) 00:06:13.969 EAL: Ignore mapping IO port bar(1) 00:06:13.969 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.6 (socket 1) 00:06:14.230 EAL: Ignore mapping IO port bar(1) 00:06:14.230 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.7 (socket 1) 00:06:18.511 EAL: Releasing PCI mapped resource for 0000:65:00.0 00:06:18.511 EAL: Calling pci_unmap_resource for 0000:65:00.0 at 0x202001080000 00:06:18.511 Starting DPDK initialization... 00:06:18.511 Starting SPDK post initialization... 00:06:18.511 SPDK NVMe probe 00:06:18.511 Attaching to 0000:65:00.0 00:06:18.511 Attached to 0000:65:00.0 00:06:18.511 Cleaning up... 00:06:20.419 00:06:20.419 real 0m10.269s 00:06:20.419 user 0m4.118s 00:06:20.419 sys 0m0.174s 00:06:20.419 10:01:41 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:20.419 10:01:41 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:20.419 ************************************ 00:06:20.419 END TEST env_dpdk_post_init 00:06:20.419 ************************************ 00:06:20.419 10:01:41 env -- env/env.sh@26 -- # uname 00:06:20.419 10:01:41 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:20.419 10:01:41 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:20.420 10:01:41 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:20.420 10:01:41 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:20.420 10:01:41 env -- common/autotest_common.sh@10 -- # set +x 00:06:20.420 ************************************ 00:06:20.420 START TEST env_mem_callbacks 00:06:20.420 ************************************ 00:06:20.420 10:01:41 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:20.420 EAL: Detected CPU lcores: 128 00:06:20.420 EAL: Detected NUMA nodes: 2 00:06:20.420 EAL: Detected shared linkage of DPDK 00:06:20.420 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:20.420 EAL: Selected IOVA mode 'PA' 00:06:20.420 EAL: VFIO support initialized 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 
CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_sym 
00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters 
- name: 0000:ce:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_sym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:06:20.420 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_asym 00:06:20.420 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_sym,socket id: 1, max 
queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 
00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_asym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:20.421 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_sym 00:06:20.421 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:20.421 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:20.421 00:06:20.421 00:06:20.421 CUnit - A unit testing framework for C - Version 2.1-3 00:06:20.421 http://cunit.sourceforge.net/ 00:06:20.421 00:06:20.421 00:06:20.421 Suite: memory 00:06:20.421 Test: test ... 
00:06:20.421 register 0x200000200000 2097152 00:06:20.421 register 0x201000a00000 2097152 00:06:20.421 malloc 3145728 00:06:20.421 register 0x200000400000 4194304 00:06:20.421 buf 0x200000500000 len 3145728 PASSED 00:06:20.421 malloc 64 00:06:20.421 buf 0x2000004fff40 len 64 PASSED 00:06:20.421 malloc 4194304 00:06:20.421 register 0x200000800000 6291456 00:06:20.421 buf 0x200000a00000 len 4194304 PASSED 00:06:20.421 free 0x200000500000 3145728 00:06:20.421 free 0x2000004fff40 64 00:06:20.421 unregister 0x200000400000 4194304 PASSED 00:06:20.421 free 0x200000a00000 4194304 00:06:20.421 unregister 0x200000800000 6291456 PASSED 00:06:20.421 malloc 8388608 00:06:20.421 register 0x200000400000 10485760 00:06:20.421 buf 0x200000600000 len 8388608 PASSED 00:06:20.421 free 0x200000600000 8388608 00:06:20.421 unregister 0x200000400000 10485760 PASSED 00:06:20.421 passed 00:06:20.421 00:06:20.421 Run Summary: Type Total Ran Passed Failed Inactive 00:06:20.421 suites 1 1 n/a 0 0 00:06:20.421 tests 1 1 1 0 0 00:06:20.421 asserts 16 16 16 0 n/a 00:06:20.421 00:06:20.421 Elapsed time = 0.008 seconds 00:06:20.421 00:06:20.421 real 0m0.089s 00:06:20.421 user 0m0.027s 00:06:20.421 sys 0m0.062s 00:06:20.421 10:01:42 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:20.421 10:01:42 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:20.421 ************************************ 00:06:20.421 END TEST env_mem_callbacks 00:06:20.421 ************************************ 00:06:20.421 00:06:20.421 real 0m11.953s 00:06:20.421 user 0m4.976s 00:06:20.421 sys 0m0.994s 00:06:20.421 10:01:42 env -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:20.421 10:01:42 env -- common/autotest_common.sh@10 -- # set +x 00:06:20.421 ************************************ 00:06:20.421 END TEST env 00:06:20.421 ************************************ 00:06:20.421 10:01:42 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:20.421 10:01:42 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:20.421 10:01:42 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:20.421 10:01:42 -- common/autotest_common.sh@10 -- # set +x 00:06:20.421 ************************************ 00:06:20.421 START TEST rpc 00:06:20.421 ************************************ 00:06:20.421 10:01:42 rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:20.421 * Looking for test storage... 00:06:20.421 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:20.421 10:01:42 rpc -- rpc/rpc.sh@65 -- # spdk_pid=910269 00:06:20.421 10:01:42 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:20.421 10:01:42 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:20.421 10:01:42 rpc -- rpc/rpc.sh@67 -- # waitforlisten 910269 00:06:20.422 10:01:42 rpc -- common/autotest_common.sh@830 -- # '[' -z 910269 ']' 00:06:20.422 10:01:42 rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:20.422 10:01:42 rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:20.422 10:01:42 rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:20.422 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:20.422 10:01:42 rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:20.422 10:01:42 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:20.681 [2024-06-10 10:01:42.319627] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:06:20.681 [2024-06-10 10:01:42.319681] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid910269 ] 00:06:20.681 [2024-06-10 10:01:42.410745] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.681 [2024-06-10 10:01:42.476415] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:20.681 [2024-06-10 10:01:42.476454] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 910269' to capture a snapshot of events at runtime. 00:06:20.681 [2024-06-10 10:01:42.476461] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:20.681 [2024-06-10 10:01:42.476467] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:20.681 [2024-06-10 10:01:42.476472] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid910269 for offline analysis/debug. 00:06:20.681 [2024-06-10 10:01:42.476498] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.622 10:01:43 rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:21.622 10:01:43 rpc -- common/autotest_common.sh@863 -- # return 0 00:06:21.622 10:01:43 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:21.622 10:01:43 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:21.622 10:01:43 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:21.622 10:01:43 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:21.622 10:01:43 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:21.622 10:01:43 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:21.622 10:01:43 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:21.622 ************************************ 00:06:21.622 START TEST rpc_integrity 00:06:21.622 ************************************ 00:06:21.622 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # rpc_integrity 00:06:21.622 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:21.622 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.622 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.622 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:21.622 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:21.622 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:21.622 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:21.622 10:01:43 rpc.rpc_integrity -- 
rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:21.622 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.622 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.622 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:21.622 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:21.622 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:21.622 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.622 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.622 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:21.622 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:21.622 { 00:06:21.622 "name": "Malloc0", 00:06:21.622 "aliases": [ 00:06:21.622 "9776ecde-6104-41ee-b556-2a74a22fa84e" 00:06:21.622 ], 00:06:21.622 "product_name": "Malloc disk", 00:06:21.622 "block_size": 512, 00:06:21.622 "num_blocks": 16384, 00:06:21.622 "uuid": "9776ecde-6104-41ee-b556-2a74a22fa84e", 00:06:21.622 "assigned_rate_limits": { 00:06:21.622 "rw_ios_per_sec": 0, 00:06:21.622 "rw_mbytes_per_sec": 0, 00:06:21.623 "r_mbytes_per_sec": 0, 00:06:21.623 "w_mbytes_per_sec": 0 00:06:21.623 }, 00:06:21.623 "claimed": false, 00:06:21.623 "zoned": false, 00:06:21.623 "supported_io_types": { 00:06:21.623 "read": true, 00:06:21.623 "write": true, 00:06:21.623 "unmap": true, 00:06:21.623 "write_zeroes": true, 00:06:21.623 "flush": true, 00:06:21.623 "reset": true, 00:06:21.623 "compare": false, 00:06:21.623 "compare_and_write": false, 00:06:21.623 "abort": true, 00:06:21.623 "nvme_admin": false, 00:06:21.623 "nvme_io": false 00:06:21.623 }, 00:06:21.623 "memory_domains": [ 00:06:21.623 { 00:06:21.623 "dma_device_id": "system", 00:06:21.623 "dma_device_type": 1 00:06:21.623 }, 00:06:21.623 { 00:06:21.623 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.623 "dma_device_type": 2 00:06:21.623 } 00:06:21.623 ], 00:06:21.623 "driver_specific": {} 00:06:21.623 } 00:06:21.623 ]' 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.623 [2024-06-10 10:01:43.331896] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:21.623 [2024-06-10 10:01:43.331927] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:21.623 [2024-06-10 10:01:43.331939] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27566d0 00:06:21.623 [2024-06-10 10:01:43.331946] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:21.623 [2024-06-10 10:01:43.333218] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:21.623 [2024-06-10 10:01:43.333239] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:21.623 Passthru0 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:21.623 10:01:43 rpc.rpc_integrity -- 
common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:21.623 { 00:06:21.623 "name": "Malloc0", 00:06:21.623 "aliases": [ 00:06:21.623 "9776ecde-6104-41ee-b556-2a74a22fa84e" 00:06:21.623 ], 00:06:21.623 "product_name": "Malloc disk", 00:06:21.623 "block_size": 512, 00:06:21.623 "num_blocks": 16384, 00:06:21.623 "uuid": "9776ecde-6104-41ee-b556-2a74a22fa84e", 00:06:21.623 "assigned_rate_limits": { 00:06:21.623 "rw_ios_per_sec": 0, 00:06:21.623 "rw_mbytes_per_sec": 0, 00:06:21.623 "r_mbytes_per_sec": 0, 00:06:21.623 "w_mbytes_per_sec": 0 00:06:21.623 }, 00:06:21.623 "claimed": true, 00:06:21.623 "claim_type": "exclusive_write", 00:06:21.623 "zoned": false, 00:06:21.623 "supported_io_types": { 00:06:21.623 "read": true, 00:06:21.623 "write": true, 00:06:21.623 "unmap": true, 00:06:21.623 "write_zeroes": true, 00:06:21.623 "flush": true, 00:06:21.623 "reset": true, 00:06:21.623 "compare": false, 00:06:21.623 "compare_and_write": false, 00:06:21.623 "abort": true, 00:06:21.623 "nvme_admin": false, 00:06:21.623 "nvme_io": false 00:06:21.623 }, 00:06:21.623 "memory_domains": [ 00:06:21.623 { 00:06:21.623 "dma_device_id": "system", 00:06:21.623 "dma_device_type": 1 00:06:21.623 }, 00:06:21.623 { 00:06:21.623 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.623 "dma_device_type": 2 00:06:21.623 } 00:06:21.623 ], 00:06:21.623 "driver_specific": {} 00:06:21.623 }, 00:06:21.623 { 00:06:21.623 "name": "Passthru0", 00:06:21.623 "aliases": [ 00:06:21.623 "5462a508-9e99-5285-aaaf-b30e089df160" 00:06:21.623 ], 00:06:21.623 "product_name": "passthru", 00:06:21.623 "block_size": 512, 00:06:21.623 "num_blocks": 16384, 00:06:21.623 "uuid": "5462a508-9e99-5285-aaaf-b30e089df160", 00:06:21.623 "assigned_rate_limits": { 00:06:21.623 "rw_ios_per_sec": 0, 00:06:21.623 "rw_mbytes_per_sec": 0, 00:06:21.623 "r_mbytes_per_sec": 0, 00:06:21.623 "w_mbytes_per_sec": 0 00:06:21.623 }, 00:06:21.623 "claimed": false, 00:06:21.623 "zoned": false, 00:06:21.623 "supported_io_types": { 00:06:21.623 "read": true, 00:06:21.623 "write": true, 00:06:21.623 "unmap": true, 00:06:21.623 "write_zeroes": true, 00:06:21.623 "flush": true, 00:06:21.623 "reset": true, 00:06:21.623 "compare": false, 00:06:21.623 "compare_and_write": false, 00:06:21.623 "abort": true, 00:06:21.623 "nvme_admin": false, 00:06:21.623 "nvme_io": false 00:06:21.623 }, 00:06:21.623 "memory_domains": [ 00:06:21.623 { 00:06:21.623 "dma_device_id": "system", 00:06:21.623 "dma_device_type": 1 00:06:21.623 }, 00:06:21.623 { 00:06:21.623 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.623 "dma_device_type": 2 00:06:21.623 } 00:06:21.623 ], 00:06:21.623 "driver_specific": { 00:06:21.623 "passthru": { 00:06:21.623 "name": "Passthru0", 00:06:21.623 "base_bdev_name": "Malloc0" 00:06:21.623 } 00:06:21.623 } 00:06:21.623 } 00:06:21.623 ]' 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@588 
-- # [[ 0 == 0 ]] 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:21.623 10:01:43 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:21.623 00:06:21.623 real 0m0.299s 00:06:21.623 user 0m0.188s 00:06:21.623 sys 0m0.040s 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:21.623 10:01:43 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.623 ************************************ 00:06:21.623 END TEST rpc_integrity 00:06:21.623 ************************************ 00:06:21.883 10:01:43 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:21.883 10:01:43 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:21.883 10:01:43 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:21.883 10:01:43 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:21.883 ************************************ 00:06:21.883 START TEST rpc_plugins 00:06:21.883 ************************************ 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # rpc_plugins 00:06:21.883 10:01:43 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:21.883 10:01:43 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:21.883 10:01:43 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:21.883 10:01:43 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:21.883 { 00:06:21.883 "name": "Malloc1", 00:06:21.883 "aliases": [ 00:06:21.883 "7767c5b2-d8b6-43b0-8013-5f8fda9cba81" 00:06:21.883 ], 00:06:21.883 "product_name": "Malloc disk", 00:06:21.883 "block_size": 4096, 00:06:21.883 "num_blocks": 256, 00:06:21.883 "uuid": "7767c5b2-d8b6-43b0-8013-5f8fda9cba81", 00:06:21.883 "assigned_rate_limits": { 00:06:21.883 "rw_ios_per_sec": 0, 00:06:21.883 "rw_mbytes_per_sec": 0, 00:06:21.883 "r_mbytes_per_sec": 0, 00:06:21.883 "w_mbytes_per_sec": 0 00:06:21.883 }, 00:06:21.883 "claimed": false, 00:06:21.883 "zoned": false, 00:06:21.883 "supported_io_types": { 00:06:21.883 "read": true, 00:06:21.883 "write": true, 00:06:21.883 "unmap": true, 00:06:21.883 "write_zeroes": true, 00:06:21.883 "flush": true, 00:06:21.883 "reset": true, 00:06:21.883 
"compare": false, 00:06:21.883 "compare_and_write": false, 00:06:21.883 "abort": true, 00:06:21.883 "nvme_admin": false, 00:06:21.883 "nvme_io": false 00:06:21.883 }, 00:06:21.883 "memory_domains": [ 00:06:21.883 { 00:06:21.883 "dma_device_id": "system", 00:06:21.883 "dma_device_type": 1 00:06:21.883 }, 00:06:21.883 { 00:06:21.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.883 "dma_device_type": 2 00:06:21.883 } 00:06:21.883 ], 00:06:21.883 "driver_specific": {} 00:06:21.883 } 00:06:21.883 ]' 00:06:21.883 10:01:43 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:21.883 10:01:43 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:21.883 10:01:43 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:21.883 10:01:43 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:21.883 10:01:43 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:21.883 10:01:43 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:21.883 10:01:43 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:21.883 00:06:21.883 real 0m0.138s 00:06:21.883 user 0m0.081s 00:06:21.883 sys 0m0.020s 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:21.883 10:01:43 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:21.884 ************************************ 00:06:21.884 END TEST rpc_plugins 00:06:21.884 ************************************ 00:06:21.884 10:01:43 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:21.884 10:01:43 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:21.884 10:01:43 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:21.884 10:01:43 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:22.144 ************************************ 00:06:22.144 START TEST rpc_trace_cmd_test 00:06:22.144 ************************************ 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # rpc_trace_cmd_test 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:22.144 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid910269", 00:06:22.144 "tpoint_group_mask": "0x8", 00:06:22.144 "iscsi_conn": { 00:06:22.144 "mask": "0x2", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "scsi": { 00:06:22.144 "mask": "0x4", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "bdev": { 00:06:22.144 "mask": "0x8", 00:06:22.144 "tpoint_mask": "0xffffffffffffffff" 00:06:22.144 }, 00:06:22.144 "nvmf_rdma": { 
00:06:22.144 "mask": "0x10", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "nvmf_tcp": { 00:06:22.144 "mask": "0x20", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "ftl": { 00:06:22.144 "mask": "0x40", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "blobfs": { 00:06:22.144 "mask": "0x80", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "dsa": { 00:06:22.144 "mask": "0x200", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "thread": { 00:06:22.144 "mask": "0x400", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "nvme_pcie": { 00:06:22.144 "mask": "0x800", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "iaa": { 00:06:22.144 "mask": "0x1000", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "nvme_tcp": { 00:06:22.144 "mask": "0x2000", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "bdev_nvme": { 00:06:22.144 "mask": "0x4000", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 }, 00:06:22.144 "sock": { 00:06:22.144 "mask": "0x8000", 00:06:22.144 "tpoint_mask": "0x0" 00:06:22.144 } 00:06:22.144 }' 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:22.144 10:01:43 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:22.404 10:01:44 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:22.404 00:06:22.405 real 0m0.250s 00:06:22.405 user 0m0.205s 00:06:22.405 sys 0m0.035s 00:06:22.405 10:01:44 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:22.405 10:01:44 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:22.405 ************************************ 00:06:22.405 END TEST rpc_trace_cmd_test 00:06:22.405 ************************************ 00:06:22.405 10:01:44 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:22.405 10:01:44 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:22.405 10:01:44 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:22.405 10:01:44 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:22.405 10:01:44 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:22.405 10:01:44 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:22.405 ************************************ 00:06:22.405 START TEST rpc_daemon_integrity 00:06:22.405 ************************************ 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # rpc_integrity 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:22.405 10:01:44 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:22.405 { 00:06:22.405 "name": "Malloc2", 00:06:22.405 "aliases": [ 00:06:22.405 "ba6549e4-394f-41b5-ae24-ab2b80cd03c9" 00:06:22.405 ], 00:06:22.405 "product_name": "Malloc disk", 00:06:22.405 "block_size": 512, 00:06:22.405 "num_blocks": 16384, 00:06:22.405 "uuid": "ba6549e4-394f-41b5-ae24-ab2b80cd03c9", 00:06:22.405 "assigned_rate_limits": { 00:06:22.405 "rw_ios_per_sec": 0, 00:06:22.405 "rw_mbytes_per_sec": 0, 00:06:22.405 "r_mbytes_per_sec": 0, 00:06:22.405 "w_mbytes_per_sec": 0 00:06:22.405 }, 00:06:22.405 "claimed": false, 00:06:22.405 "zoned": false, 00:06:22.405 "supported_io_types": { 00:06:22.405 "read": true, 00:06:22.405 "write": true, 00:06:22.405 "unmap": true, 00:06:22.405 "write_zeroes": true, 00:06:22.405 "flush": true, 00:06:22.405 "reset": true, 00:06:22.405 "compare": false, 00:06:22.405 "compare_and_write": false, 00:06:22.405 "abort": true, 00:06:22.405 "nvme_admin": false, 00:06:22.405 "nvme_io": false 00:06:22.405 }, 00:06:22.405 "memory_domains": [ 00:06:22.405 { 00:06:22.405 "dma_device_id": "system", 00:06:22.405 "dma_device_type": 1 00:06:22.405 }, 00:06:22.405 { 00:06:22.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:22.405 "dma_device_type": 2 00:06:22.405 } 00:06:22.405 ], 00:06:22.405 "driver_specific": {} 00:06:22.405 } 00:06:22.405 ]' 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.405 [2024-06-10 10:01:44.238333] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:22.405 [2024-06-10 10:01:44.238360] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:22.405 [2024-06-10 10:01:44.238372] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28edd60 00:06:22.405 [2024-06-10 10:01:44.238378] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:22.405 [2024-06-10 10:01:44.239520] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:22.405 [2024-06-10 
10:01:44.239539] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:22.405 Passthru0 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:22.405 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:22.405 { 00:06:22.405 "name": "Malloc2", 00:06:22.405 "aliases": [ 00:06:22.405 "ba6549e4-394f-41b5-ae24-ab2b80cd03c9" 00:06:22.405 ], 00:06:22.405 "product_name": "Malloc disk", 00:06:22.405 "block_size": 512, 00:06:22.405 "num_blocks": 16384, 00:06:22.405 "uuid": "ba6549e4-394f-41b5-ae24-ab2b80cd03c9", 00:06:22.405 "assigned_rate_limits": { 00:06:22.405 "rw_ios_per_sec": 0, 00:06:22.405 "rw_mbytes_per_sec": 0, 00:06:22.405 "r_mbytes_per_sec": 0, 00:06:22.405 "w_mbytes_per_sec": 0 00:06:22.405 }, 00:06:22.405 "claimed": true, 00:06:22.405 "claim_type": "exclusive_write", 00:06:22.405 "zoned": false, 00:06:22.405 "supported_io_types": { 00:06:22.405 "read": true, 00:06:22.405 "write": true, 00:06:22.405 "unmap": true, 00:06:22.405 "write_zeroes": true, 00:06:22.405 "flush": true, 00:06:22.405 "reset": true, 00:06:22.405 "compare": false, 00:06:22.405 "compare_and_write": false, 00:06:22.405 "abort": true, 00:06:22.405 "nvme_admin": false, 00:06:22.405 "nvme_io": false 00:06:22.405 }, 00:06:22.405 "memory_domains": [ 00:06:22.405 { 00:06:22.405 "dma_device_id": "system", 00:06:22.405 "dma_device_type": 1 00:06:22.405 }, 00:06:22.405 { 00:06:22.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:22.405 "dma_device_type": 2 00:06:22.405 } 00:06:22.405 ], 00:06:22.405 "driver_specific": {} 00:06:22.405 }, 00:06:22.405 { 00:06:22.405 "name": "Passthru0", 00:06:22.405 "aliases": [ 00:06:22.405 "2500d5f4-b90c-5d40-b0be-3036477b25bc" 00:06:22.405 ], 00:06:22.405 "product_name": "passthru", 00:06:22.405 "block_size": 512, 00:06:22.405 "num_blocks": 16384, 00:06:22.405 "uuid": "2500d5f4-b90c-5d40-b0be-3036477b25bc", 00:06:22.405 "assigned_rate_limits": { 00:06:22.405 "rw_ios_per_sec": 0, 00:06:22.405 "rw_mbytes_per_sec": 0, 00:06:22.405 "r_mbytes_per_sec": 0, 00:06:22.405 "w_mbytes_per_sec": 0 00:06:22.405 }, 00:06:22.405 "claimed": false, 00:06:22.405 "zoned": false, 00:06:22.405 "supported_io_types": { 00:06:22.405 "read": true, 00:06:22.405 "write": true, 00:06:22.405 "unmap": true, 00:06:22.405 "write_zeroes": true, 00:06:22.405 "flush": true, 00:06:22.405 "reset": true, 00:06:22.405 "compare": false, 00:06:22.405 "compare_and_write": false, 00:06:22.405 "abort": true, 00:06:22.405 "nvme_admin": false, 00:06:22.405 "nvme_io": false 00:06:22.405 }, 00:06:22.405 "memory_domains": [ 00:06:22.405 { 00:06:22.405 "dma_device_id": "system", 00:06:22.405 "dma_device_type": 1 00:06:22.405 }, 00:06:22.405 { 00:06:22.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:22.405 "dma_device_type": 2 00:06:22.405 } 00:06:22.405 ], 00:06:22.405 "driver_specific": { 00:06:22.405 "passthru": { 00:06:22.405 "name": "Passthru0", 00:06:22.405 "base_bdev_name": "Malloc2" 00:06:22.405 } 00:06:22.405 } 00:06:22.405 } 00:06:22.405 ]' 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:22.666 10:01:44 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:22.666 00:06:22.666 real 0m0.291s 00:06:22.666 user 0m0.187s 00:06:22.666 sys 0m0.040s 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:22.666 10:01:44 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.666 ************************************ 00:06:22.666 END TEST rpc_daemon_integrity 00:06:22.666 ************************************ 00:06:22.666 10:01:44 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:22.666 10:01:44 rpc -- rpc/rpc.sh@84 -- # killprocess 910269 00:06:22.666 10:01:44 rpc -- common/autotest_common.sh@949 -- # '[' -z 910269 ']' 00:06:22.666 10:01:44 rpc -- common/autotest_common.sh@953 -- # kill -0 910269 00:06:22.666 10:01:44 rpc -- common/autotest_common.sh@954 -- # uname 00:06:22.666 10:01:44 rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:22.666 10:01:44 rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 910269 00:06:22.666 10:01:44 rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:06:22.666 10:01:44 rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:06:22.666 10:01:44 rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 910269' 00:06:22.666 killing process with pid 910269 00:06:22.666 10:01:44 rpc -- common/autotest_common.sh@968 -- # kill 910269 00:06:22.666 10:01:44 rpc -- common/autotest_common.sh@973 -- # wait 910269 00:06:22.926 00:06:22.926 real 0m2.520s 00:06:22.926 user 0m3.322s 00:06:22.926 sys 0m0.721s 00:06:22.926 10:01:44 rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:22.926 10:01:44 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:22.926 ************************************ 00:06:22.926 END TEST rpc 00:06:22.926 ************************************ 00:06:22.926 10:01:44 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:22.926 10:01:44 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:22.926 10:01:44 -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:06:22.926 10:01:44 -- common/autotest_common.sh@10 -- # set +x 00:06:22.926 ************************************ 00:06:22.926 START TEST skip_rpc 00:06:22.926 ************************************ 00:06:22.926 10:01:44 skip_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:23.186 * Looking for test storage... 00:06:23.186 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:23.186 10:01:44 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:23.186 10:01:44 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:23.186 10:01:44 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:23.186 10:01:44 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:23.186 10:01:44 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:23.186 10:01:44 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:23.186 ************************************ 00:06:23.186 START TEST skip_rpc 00:06:23.186 ************************************ 00:06:23.186 10:01:44 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # test_skip_rpc 00:06:23.186 10:01:44 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=910762 00:06:23.186 10:01:44 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:23.186 10:01:44 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:23.186 10:01:44 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:23.186 [2024-06-10 10:01:44.946747] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:06:23.186 [2024-06-10 10:01:44.946795] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid910762 ] 00:06:23.186 [2024-06-10 10:01:45.036039] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.445 [2024-06-10 10:01:45.111789] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@649 -- # local es=0 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@637 -- # local arg=rpc_cmd 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # type -t rpc_cmd 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # rpc_cmd spdk_get_version 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@588 -- # [[ 1 == 0 ]] 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # es=1 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 910762 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@949 -- # '[' -z 910762 ']' 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # kill -0 910762 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # uname 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 910762 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 910762' 00:06:28.727 killing process with pid 910762 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # kill 910762 00:06:28.727 10:01:49 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # wait 910762 00:06:28.727 00:06:28.727 real 0m5.264s 00:06:28.727 user 0m5.023s 00:06:28.727 sys 0m0.253s 00:06:28.727 10:01:50 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:28.727 10:01:50 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:28.727 ************************************ 00:06:28.727 END TEST skip_rpc 00:06:28.727 ************************************ 00:06:28.727 10:01:50 
skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:28.727 10:01:50 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:28.727 10:01:50 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:28.727 10:01:50 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:28.727 ************************************ 00:06:28.727 START TEST skip_rpc_with_json 00:06:28.727 ************************************ 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # test_skip_rpc_with_json 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=911705 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 911705 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@830 -- # '[' -z 911705 ']' 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:28.727 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:28.727 10:01:50 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:28.727 [2024-06-10 10:01:50.298882] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:06:28.727 [2024-06-10 10:01:50.298946] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid911705 ] 00:06:28.727 [2024-06-10 10:01:50.390566] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.727 [2024-06-10 10:01:50.458234] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@863 -- # return 0 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:29.298 [2024-06-10 10:01:51.120749] nvmf_rpc.c:2558:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:29.298 request: 00:06:29.298 { 00:06:29.298 "trtype": "tcp", 00:06:29.298 "method": "nvmf_get_transports", 00:06:29.298 "req_id": 1 00:06:29.298 } 00:06:29.298 Got JSON-RPC error response 00:06:29.298 response: 00:06:29.298 { 00:06:29.298 "code": -19, 00:06:29.298 "message": "No such device" 00:06:29.298 } 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 1 == 0 ]] 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:29.298 [2024-06-10 10:01:51.132865] tcp.c: 716:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:29.298 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:29.559 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:29.559 10:01:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:29.559 { 00:06:29.559 "subsystems": [ 00:06:29.559 { 00:06:29.559 "subsystem": "keyring", 00:06:29.559 "config": [] 00:06:29.559 }, 00:06:29.559 { 00:06:29.559 "subsystem": "iobuf", 00:06:29.559 "config": [ 00:06:29.559 { 00:06:29.559 "method": "iobuf_set_options", 00:06:29.559 "params": { 00:06:29.559 "small_pool_count": 8192, 00:06:29.559 "large_pool_count": 1024, 00:06:29.559 "small_bufsize": 8192, 00:06:29.559 "large_bufsize": 135168 00:06:29.559 } 00:06:29.559 } 00:06:29.559 ] 00:06:29.559 }, 00:06:29.559 { 00:06:29.559 "subsystem": "sock", 00:06:29.559 "config": [ 00:06:29.559 { 00:06:29.559 "method": "sock_set_default_impl", 00:06:29.559 "params": { 00:06:29.559 "impl_name": "posix" 00:06:29.559 } 00:06:29.559 }, 00:06:29.559 { 00:06:29.559 "method": "sock_impl_set_options", 00:06:29.559 "params": { 00:06:29.559 "impl_name": "ssl", 00:06:29.559 "recv_buf_size": 4096, 00:06:29.559 "send_buf_size": 4096, 00:06:29.559 
"enable_recv_pipe": true, 00:06:29.559 "enable_quickack": false, 00:06:29.559 "enable_placement_id": 0, 00:06:29.559 "enable_zerocopy_send_server": true, 00:06:29.559 "enable_zerocopy_send_client": false, 00:06:29.559 "zerocopy_threshold": 0, 00:06:29.559 "tls_version": 0, 00:06:29.559 "enable_ktls": false 00:06:29.559 } 00:06:29.559 }, 00:06:29.559 { 00:06:29.559 "method": "sock_impl_set_options", 00:06:29.559 "params": { 00:06:29.559 "impl_name": "posix", 00:06:29.559 "recv_buf_size": 2097152, 00:06:29.559 "send_buf_size": 2097152, 00:06:29.559 "enable_recv_pipe": true, 00:06:29.559 "enable_quickack": false, 00:06:29.559 "enable_placement_id": 0, 00:06:29.559 "enable_zerocopy_send_server": true, 00:06:29.559 "enable_zerocopy_send_client": false, 00:06:29.559 "zerocopy_threshold": 0, 00:06:29.559 "tls_version": 0, 00:06:29.559 "enable_ktls": false 00:06:29.559 } 00:06:29.559 } 00:06:29.559 ] 00:06:29.559 }, 00:06:29.559 { 00:06:29.559 "subsystem": "vmd", 00:06:29.559 "config": [] 00:06:29.559 }, 00:06:29.559 { 00:06:29.559 "subsystem": "accel", 00:06:29.559 "config": [ 00:06:29.559 { 00:06:29.559 "method": "accel_set_options", 00:06:29.559 "params": { 00:06:29.559 "small_cache_size": 128, 00:06:29.559 "large_cache_size": 16, 00:06:29.559 "task_count": 2048, 00:06:29.559 "sequence_count": 2048, 00:06:29.559 "buf_count": 2048 00:06:29.559 } 00:06:29.559 } 00:06:29.559 ] 00:06:29.559 }, 00:06:29.559 { 00:06:29.559 "subsystem": "bdev", 00:06:29.559 "config": [ 00:06:29.559 { 00:06:29.559 "method": "bdev_set_options", 00:06:29.559 "params": { 00:06:29.559 "bdev_io_pool_size": 65535, 00:06:29.559 "bdev_io_cache_size": 256, 00:06:29.559 "bdev_auto_examine": true, 00:06:29.559 "iobuf_small_cache_size": 128, 00:06:29.559 "iobuf_large_cache_size": 16 00:06:29.559 } 00:06:29.559 }, 00:06:29.559 { 00:06:29.559 "method": "bdev_raid_set_options", 00:06:29.559 "params": { 00:06:29.559 "process_window_size_kb": 1024 00:06:29.559 } 00:06:29.559 }, 00:06:29.560 { 00:06:29.560 "method": "bdev_iscsi_set_options", 00:06:29.560 "params": { 00:06:29.560 "timeout_sec": 30 00:06:29.560 } 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "method": "bdev_nvme_set_options", 00:06:29.560 "params": { 00:06:29.560 "action_on_timeout": "none", 00:06:29.560 "timeout_us": 0, 00:06:29.560 "timeout_admin_us": 0, 00:06:29.560 "keep_alive_timeout_ms": 10000, 00:06:29.560 "arbitration_burst": 0, 00:06:29.560 "low_priority_weight": 0, 00:06:29.560 "medium_priority_weight": 0, 00:06:29.560 "high_priority_weight": 0, 00:06:29.560 "nvme_adminq_poll_period_us": 10000, 00:06:29.560 "nvme_ioq_poll_period_us": 0, 00:06:29.560 "io_queue_requests": 0, 00:06:29.560 "delay_cmd_submit": true, 00:06:29.560 "transport_retry_count": 4, 00:06:29.560 "bdev_retry_count": 3, 00:06:29.560 "transport_ack_timeout": 0, 00:06:29.560 "ctrlr_loss_timeout_sec": 0, 00:06:29.560 "reconnect_delay_sec": 0, 00:06:29.560 "fast_io_fail_timeout_sec": 0, 00:06:29.560 "disable_auto_failback": false, 00:06:29.560 "generate_uuids": false, 00:06:29.560 "transport_tos": 0, 00:06:29.560 "nvme_error_stat": false, 00:06:29.560 "rdma_srq_size": 0, 00:06:29.560 "io_path_stat": false, 00:06:29.560 "allow_accel_sequence": false, 00:06:29.560 "rdma_max_cq_size": 0, 00:06:29.560 "rdma_cm_event_timeout_ms": 0, 00:06:29.560 "dhchap_digests": [ 00:06:29.560 "sha256", 00:06:29.560 "sha384", 00:06:29.560 "sha512" 00:06:29.560 ], 00:06:29.560 "dhchap_dhgroups": [ 00:06:29.560 "null", 00:06:29.560 "ffdhe2048", 00:06:29.560 "ffdhe3072", 00:06:29.560 "ffdhe4096", 00:06:29.560 "ffdhe6144", 
00:06:29.560 "ffdhe8192" 00:06:29.560 ] 00:06:29.560 } 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "method": "bdev_nvme_set_hotplug", 00:06:29.560 "params": { 00:06:29.560 "period_us": 100000, 00:06:29.560 "enable": false 00:06:29.560 } 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "method": "bdev_wait_for_examine" 00:06:29.560 } 00:06:29.560 ] 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "subsystem": "scsi", 00:06:29.560 "config": null 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "subsystem": "scheduler", 00:06:29.560 "config": [ 00:06:29.560 { 00:06:29.560 "method": "framework_set_scheduler", 00:06:29.560 "params": { 00:06:29.560 "name": "static" 00:06:29.560 } 00:06:29.560 } 00:06:29.560 ] 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "subsystem": "vhost_scsi", 00:06:29.560 "config": [] 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "subsystem": "vhost_blk", 00:06:29.560 "config": [] 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "subsystem": "ublk", 00:06:29.560 "config": [] 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "subsystem": "nbd", 00:06:29.560 "config": [] 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "subsystem": "nvmf", 00:06:29.560 "config": [ 00:06:29.560 { 00:06:29.560 "method": "nvmf_set_config", 00:06:29.560 "params": { 00:06:29.560 "discovery_filter": "match_any", 00:06:29.560 "admin_cmd_passthru": { 00:06:29.560 "identify_ctrlr": false 00:06:29.560 } 00:06:29.560 } 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "method": "nvmf_set_max_subsystems", 00:06:29.560 "params": { 00:06:29.560 "max_subsystems": 1024 00:06:29.560 } 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "method": "nvmf_set_crdt", 00:06:29.560 "params": { 00:06:29.560 "crdt1": 0, 00:06:29.560 "crdt2": 0, 00:06:29.560 "crdt3": 0 00:06:29.560 } 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "method": "nvmf_create_transport", 00:06:29.560 "params": { 00:06:29.560 "trtype": "TCP", 00:06:29.560 "max_queue_depth": 128, 00:06:29.560 "max_io_qpairs_per_ctrlr": 127, 00:06:29.560 "in_capsule_data_size": 4096, 00:06:29.560 "max_io_size": 131072, 00:06:29.560 "io_unit_size": 131072, 00:06:29.560 "max_aq_depth": 128, 00:06:29.560 "num_shared_buffers": 511, 00:06:29.560 "buf_cache_size": 4294967295, 00:06:29.560 "dif_insert_or_strip": false, 00:06:29.560 "zcopy": false, 00:06:29.560 "c2h_success": true, 00:06:29.560 "sock_priority": 0, 00:06:29.560 "abort_timeout_sec": 1, 00:06:29.560 "ack_timeout": 0, 00:06:29.560 "data_wr_pool_size": 0 00:06:29.560 } 00:06:29.560 } 00:06:29.560 ] 00:06:29.560 }, 00:06:29.560 { 00:06:29.560 "subsystem": "iscsi", 00:06:29.560 "config": [ 00:06:29.560 { 00:06:29.560 "method": "iscsi_set_options", 00:06:29.560 "params": { 00:06:29.560 "node_base": "iqn.2016-06.io.spdk", 00:06:29.560 "max_sessions": 128, 00:06:29.560 "max_connections_per_session": 2, 00:06:29.560 "max_queue_depth": 64, 00:06:29.560 "default_time2wait": 2, 00:06:29.560 "default_time2retain": 20, 00:06:29.560 "first_burst_length": 8192, 00:06:29.560 "immediate_data": true, 00:06:29.560 "allow_duplicated_isid": false, 00:06:29.560 "error_recovery_level": 0, 00:06:29.560 "nop_timeout": 60, 00:06:29.560 "nop_in_interval": 30, 00:06:29.560 "disable_chap": false, 00:06:29.560 "require_chap": false, 00:06:29.560 "mutual_chap": false, 00:06:29.560 "chap_group": 0, 00:06:29.560 "max_large_datain_per_connection": 64, 00:06:29.560 "max_r2t_per_connection": 4, 00:06:29.560 "pdu_pool_size": 36864, 00:06:29.560 "immediate_data_pool_size": 16384, 00:06:29.560 "data_out_pool_size": 2048 00:06:29.560 } 00:06:29.560 } 00:06:29.560 ] 00:06:29.560 } 00:06:29.560 ] 
00:06:29.560 } 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 911705 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@949 -- # '[' -z 911705 ']' 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # kill -0 911705 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # uname 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 911705 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # echo 'killing process with pid 911705' 00:06:29.560 killing process with pid 911705 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # kill 911705 00:06:29.560 10:01:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # wait 911705 00:06:29.821 10:01:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=912011 00:06:29.821 10:01:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:29.821 10:01:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 912011 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@949 -- # '[' -z 912011 ']' 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # kill -0 912011 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # uname 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 912011 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # echo 'killing process with pid 912011' 00:06:35.102 killing process with pid 912011 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # kill 912011 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # wait 912011 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:35.102 00:06:35.102 real 0m6.586s 00:06:35.102 user 0m6.487s 00:06:35.102 sys 0m0.561s 00:06:35.102 10:01:56 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:35.102 10:01:56 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:35.102 ************************************ 00:06:35.102 END TEST skip_rpc_with_json 00:06:35.102 ************************************ 00:06:35.102 10:01:56 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:35.102 10:01:56 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:35.102 10:01:56 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:35.102 10:01:56 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.102 ************************************ 00:06:35.102 START TEST skip_rpc_with_delay 00:06:35.102 ************************************ 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # test_skip_rpc_with_delay 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@649 -- # local es=0 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:35.103 [2024-06-10 10:01:56.947466] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
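The skip_rpc_with_delay case is a pure negative test: with --no-rpc-server, spdk_tgt must reject --wait-for-rpc, and the NOT wrapper asserts the non-zero exit status. A minimal sketch of the same check, assuming a local SPDK build:

    # expected to fail: there is no RPC server to wait for
    ./build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc
    [ $? -ne 0 ] && echo "rejected as expected"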
00:06:35.103 [2024-06-10 10:01:56.947551] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # es=1 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:06:35.103 00:06:35.103 real 0m0.079s 00:06:35.103 user 0m0.048s 00:06:35.103 sys 0m0.030s 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:35.103 10:01:56 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:35.103 ************************************ 00:06:35.103 END TEST skip_rpc_with_delay 00:06:35.103 ************************************ 00:06:35.363 10:01:56 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:35.363 10:01:57 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:35.363 10:01:57 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:35.363 10:01:57 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:35.363 10:01:57 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:35.363 10:01:57 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.363 ************************************ 00:06:35.363 START TEST exit_on_failed_rpc_init 00:06:35.363 ************************************ 00:06:35.363 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # test_exit_on_failed_rpc_init 00:06:35.363 10:01:57 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=912980 00:06:35.363 10:01:57 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 912980 00:06:35.363 10:01:57 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:35.363 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@830 -- # '[' -z 912980 ']' 00:06:35.363 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:35.363 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:35.363 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:35.363 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:35.363 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:35.363 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:35.363 [2024-06-10 10:01:57.117061] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
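The exit_on_failed_rpc_init case that starts here provokes an RPC socket collision on purpose; a minimal sketch of the two launches, assuming both instances fall back to the default /var/tmp/spdk.sock:

    # first target claims the default RPC socket
    ./build/bin/spdk_tgt -m 0x1 &
    # second target on another core mask reuses the same default socket and must fail
    ./build/bin/spdk_tgt -m 0x2

The trace below shows exactly that: the second instance reaches reactor startup on core 1 and then aborts when _spdk_rpc_listen finds the socket path already in use.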
00:06:35.363 [2024-06-10 10:01:57.117116] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid912980 ] 00:06:35.363 [2024-06-10 10:01:57.207101] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.623 [2024-06-10 10:01:57.274872] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@863 -- # return 0 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@649 -- # local es=0 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:36.193 10:01:57 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:36.193 [2024-06-10 10:01:58.011518] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:06:36.193 [2024-06-10 10:01:58.011566] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid913096 ] 00:06:36.453 [2024-06-10 10:01:58.084603] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.453 [2024-06-10 10:01:58.152855] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:06:36.453 [2024-06-10 10:01:58.152921] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
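Outside this negative test, two targets can coexist if the second one names its own RPC socket with -r, which is how the json_config test further down runs against /var/tmp/spdk_tgt.sock. A sketch, with an illustrative socket path and rpc_get_methods used only as a cheap liveness probe (neither appears in this particular test):

    # give the second target its own RPC socket instead of the default
    ./build/bin/spdk_tgt -m 0x2 -r /var/tmp/spdk_second.sock &
    ./scripts/rpc.py -s /var/tmp/spdk_second.sock rpc_get_methods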
00:06:36.453 [2024-06-10 10:01:58.152932] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:36.453 [2024-06-10 10:01:58.152938] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # es=234 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # es=106 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # case "$es" in 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@669 -- # es=1 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 912980 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@949 -- # '[' -z 912980 ']' 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # kill -0 912980 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # uname 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 912980 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # echo 'killing process with pid 912980' 00:06:36.453 killing process with pid 912980 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # kill 912980 00:06:36.453 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # wait 912980 00:06:36.713 00:06:36.713 real 0m1.429s 00:06:36.713 user 0m1.718s 00:06:36.713 sys 0m0.398s 00:06:36.713 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:36.713 10:01:58 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:36.713 ************************************ 00:06:36.713 END TEST exit_on_failed_rpc_init 00:06:36.713 ************************************ 00:06:36.713 10:01:58 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:36.713 00:06:36.713 real 0m13.755s 00:06:36.713 user 0m13.416s 00:06:36.713 sys 0m1.525s 00:06:36.713 10:01:58 skip_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:36.713 10:01:58 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:36.713 ************************************ 00:06:36.713 END TEST skip_rpc 00:06:36.713 ************************************ 00:06:36.713 10:01:58 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:36.713 10:01:58 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:36.713 10:01:58 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:36.713 10:01:58 -- 
common/autotest_common.sh@10 -- # set +x 00:06:36.974 ************************************ 00:06:36.974 START TEST rpc_client 00:06:36.974 ************************************ 00:06:36.974 10:01:58 rpc_client -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:36.974 * Looking for test storage... 00:06:36.974 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:36.974 10:01:58 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:36.974 OK 00:06:36.974 10:01:58 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:36.974 00:06:36.974 real 0m0.128s 00:06:36.974 user 0m0.062s 00:06:36.974 sys 0m0.073s 00:06:36.974 10:01:58 rpc_client -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:36.974 10:01:58 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:36.974 ************************************ 00:06:36.974 END TEST rpc_client 00:06:36.974 ************************************ 00:06:36.974 10:01:58 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:36.974 10:01:58 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:36.974 10:01:58 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:36.974 10:01:58 -- common/autotest_common.sh@10 -- # set +x 00:06:36.974 ************************************ 00:06:36.974 START TEST json_config 00:06:36.974 ************************************ 00:06:36.974 10:01:58 json_config -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:37.235 10:01:58 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:37.235 10:01:58 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:37.235 10:01:58 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:37.235 10:01:58 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.235 10:01:58 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.235 10:01:58 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.235 10:01:58 json_config -- paths/export.sh@5 -- # export PATH 00:06:37.235 10:01:58 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@47 -- # : 0 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:37.235 10:01:58 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST 
+ SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:37.235 INFO: JSON configuration test init 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:37.235 10:01:58 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:06:37.235 10:01:58 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:37.235 10:01:58 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:37.236 10:01:58 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:06:37.236 10:01:58 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:37.236 10:01:58 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:37.236 10:01:58 json_config -- json_config/common.sh@9 -- # local app=target 00:06:37.236 10:01:58 json_config -- json_config/common.sh@10 -- # shift 00:06:37.236 10:01:58 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:37.236 10:01:58 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:37.236 10:01:58 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:37.236 10:01:58 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:37.236 10:01:58 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:37.236 10:01:58 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=913405 00:06:37.236 10:01:58 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:37.236 Waiting for target to run... 
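The json_config test drives its target on a dedicated socket (/var/tmp/spdk_tgt.sock) and holds subsystem initialization with --wait-for-rpc so the accel framework can be configured first. A minimal sketch of that startup sequence, assuming the same local SPDK tree and using only the RPCs that appear in the trace below:

    # target waits for RPC before initializing subsystems (socket and -s 1024 as used by the test)
    ./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc &
    # queue the crypto accel configuration while initialization is paused
    ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module
    ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev
    ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev
    # push a full configuration generated from the locally attached NVMe devices
    ./scripts/gen_nvme.sh --json-with-subsystems | ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config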
00:06:37.236 10:01:58 json_config -- json_config/common.sh@25 -- # waitforlisten 913405 /var/tmp/spdk_tgt.sock 00:06:37.236 10:01:58 json_config -- common/autotest_common.sh@830 -- # '[' -z 913405 ']' 00:06:37.236 10:01:58 json_config -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:37.236 10:01:58 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:37.236 10:01:58 json_config -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:37.236 10:01:58 json_config -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:37.236 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:37.236 10:01:58 json_config -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:37.236 10:01:58 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:37.236 [2024-06-10 10:01:58.973023] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:06:37.236 [2024-06-10 10:01:58.973071] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid913405 ] 00:06:37.496 [2024-06-10 10:01:59.278586] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.496 [2024-06-10 10:01:59.328373] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.075 10:01:59 json_config -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:38.075 10:01:59 json_config -- common/autotest_common.sh@863 -- # return 0 00:06:38.075 10:01:59 json_config -- json_config/common.sh@26 -- # echo '' 00:06:38.075 00:06:38.075 10:01:59 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:38.075 10:01:59 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:38.075 10:01:59 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:06:38.075 10:01:59 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:38.076 10:01:59 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:38.076 10:01:59 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:38.076 10:01:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:38.335 10:01:59 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:38.335 10:01:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:38.335 [2024-06-10 10:02:00.178917] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:38.335 10:02:00 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:38.336 10:02:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:38.596 [2024-06-10 10:02:00.375386] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: 
Operation decrypt will be assigned to module dpdk_cryptodev 00:06:38.596 10:02:00 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:38.596 10:02:00 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:06:38.596 10:02:00 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:38.596 10:02:00 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:38.596 10:02:00 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:38.596 10:02:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:38.857 [2024-06-10 10:02:00.591705] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:44.142 10:02:05 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:06:44.142 10:02:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:44.142 10:02:05 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:44.142 10:02:05 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:06:44.142 10:02:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:44.142 10:02:05 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:06:44.142 10:02:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:06:44.142 
10:02:05 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:44.142 10:02:05 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:44.142 10:02:05 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:44.403 10:02:06 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:44.403 10:02:06 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:44.403 10:02:06 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:44.403 10:02:06 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:44.403 10:02:06 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:44.403 10:02:06 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:44.403 10:02:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:44.403 Nvme0n1p0 Nvme0n1p1 00:06:44.403 10:02:06 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:44.403 10:02:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:44.663 [2024-06-10 10:02:06.395835] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:44.663 [2024-06-10 10:02:06.395882] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:44.663 00:06:44.663 10:02:06 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:44.663 10:02:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:44.923 Malloc3 00:06:44.923 10:02:06 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:44.923 10:02:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:44.923 [2024-06-10 10:02:06.772858] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:44.923 [2024-06-10 10:02:06.772893] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:44.923 [2024-06-10 10:02:06.772907] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15a8610 00:06:44.923 [2024-06-10 10:02:06.772914] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:44.923 [2024-06-10 10:02:06.774149] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:44.923 [2024-06-10 10:02:06.774168] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:44.923 PTBdevFromMalloc3 00:06:44.923 10:02:06 json_config -- 
json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:44.923 10:02:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:45.183 Null0 00:06:45.184 10:02:06 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:45.184 10:02:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:45.444 Malloc0 00:06:45.444 10:02:07 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:45.444 10:02:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:45.704 Malloc1 00:06:45.704 10:02:07 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:45.704 10:02:07 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:45.704 102400+0 records in 00:06:45.704 102400+0 records out 00:06:45.704 104857600 bytes (105 MB, 100 MiB) copied, 0.102341 s, 1.0 GB/s 00:06:45.704 10:02:07 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:45.704 10:02:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:45.964 aio_disk 00:06:45.964 10:02:07 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:45.964 10:02:07 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:45.964 10:02:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:50.237 e95ff55b-c3d2-430e-a722-518307dc0816 00:06:50.237 10:02:11 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:50.237 10:02:11 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:50.237 10:02:11 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:50.237 10:02:11 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:50.237 10:02:11 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:50.237 10:02:12 json_config -- json_config/json_config.sh@154 -- # tgt_rpc 
bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:50.237 10:02:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:50.497 10:02:12 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:50.497 10:02:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:50.757 10:02:12 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:06:50.757 10:02:12 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:50.757 10:02:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:50.757 MallocForCryptoBdev 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@159 -- # wc -l 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:51.017 10:02:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:51.017 [2024-06-10 10:02:12.825942] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:51.017 CryptoMallocBdev 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:977f1c8a-ea61-44df-8937-7b6400c7804b bdev_register:77d3ad22-cfca-42e6-8f3a-9ab84c59671c bdev_register:7f25dce0-26c6-495b-a4fa-6679a09686b2 bdev_register:a77d4f10-2e74-4d53-971f-72bc40376f85 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 
bdev_register:Malloc1 bdev_register:aio_disk bdev_register:977f1c8a-ea61-44df-8937-7b6400c7804b bdev_register:77d3ad22-cfca-42e6-8f3a-9ab84c59671c bdev_register:7f25dce0-26c6-495b-a4fa-6679a09686b2 bdev_register:a77d4f10-2e74-4d53-971f-72bc40376f85 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@71 -- # sort 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@72 -- # sort 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:51.017 10:02:12 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:51.017 10:02:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.277 10:02:13 
json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:06:51.277 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:977f1c8a-ea61-44df-8937-7b6400c7804b 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:77d3ad22-cfca-42e6-8f3a-9ab84c59671c 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:7f25dce0-26c6-495b-a4fa-6679a09686b2 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:a77d4f10-2e74-4d53-971f-72bc40376f85 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:77d3ad22-cfca-42e6-8f3a-9ab84c59671c bdev_register:7f25dce0-26c6-495b-a4fa-6679a09686b2 bdev_register:977f1c8a-ea61-44df-8937-7b6400c7804b bdev_register:a77d4f10-2e74-4d53-971f-72bc40376f85 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev 
bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\7\7\d\3\a\d\2\2\-\c\f\c\a\-\4\2\e\6\-\8\f\3\a\-\9\a\b\8\4\c\5\9\6\7\1\c\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\7\f\2\5\d\c\e\0\-\2\6\c\6\-\4\9\5\b\-\a\4\f\a\-\6\6\7\9\a\0\9\6\8\6\b\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\9\7\7\f\1\c\8\a\-\e\a\6\1\-\4\4\d\f\-\8\9\3\7\-\7\b\6\4\0\0\c\7\8\0\4\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\7\7\d\4\f\1\0\-\2\e\7\4\-\4\d\5\3\-\9\7\1\f\-\7\2\b\c\4\0\3\7\6\f\8\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@86 -- # cat 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:77d3ad22-cfca-42e6-8f3a-9ab84c59671c bdev_register:7f25dce0-26c6-495b-a4fa-6679a09686b2 bdev_register:977f1c8a-ea61-44df-8937-7b6400c7804b bdev_register:a77d4f10-2e74-4d53-971f-72bc40376f85 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:51.278 Expected events matched: 00:06:51.278 bdev_register:77d3ad22-cfca-42e6-8f3a-9ab84c59671c 00:06:51.278 bdev_register:7f25dce0-26c6-495b-a4fa-6679a09686b2 00:06:51.278 bdev_register:977f1c8a-ea61-44df-8937-7b6400c7804b 00:06:51.278 bdev_register:a77d4f10-2e74-4d53-971f-72bc40376f85 00:06:51.278 bdev_register:aio_disk 00:06:51.278 bdev_register:CryptoMallocBdev 00:06:51.278 bdev_register:Malloc0 00:06:51.278 bdev_register:Malloc0p0 00:06:51.278 bdev_register:Malloc0p1 00:06:51.278 bdev_register:Malloc0p2 00:06:51.278 bdev_register:Malloc1 00:06:51.278 bdev_register:Malloc3 00:06:51.278 bdev_register:MallocForCryptoBdev 00:06:51.278 bdev_register:Null0 00:06:51.278 bdev_register:Nvme0n1 00:06:51.278 bdev_register:Nvme0n1p0 00:06:51.278 bdev_register:Nvme0n1p1 00:06:51.278 bdev_register:PTBdevFromMalloc3 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:06:51.278 10:02:13 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:06:51.278 10:02:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:51.278 10:02:13 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:06:51.278 10:02:13 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:06:51.278 10:02:13 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:06:51.539 10:02:13 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:06:51.539 10:02:13 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:51.539 10:02:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:51.539 MallocBdevForConfigChangeCheck 00:06:51.539 10:02:13 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:06:51.539 10:02:13 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:06:51.539 10:02:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.539 10:02:13 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:06:51.539 10:02:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:52.108 10:02:13 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:52.108 INFO: shutting down applications... 00:06:52.108 10:02:13 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:52.108 10:02:13 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:52.108 10:02:13 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:52.108 10:02:13 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:52.109 [2024-06-10 10:02:13.868998] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:54.650 Calling clear_iscsi_subsystem 00:06:54.650 Calling clear_nvmf_subsystem 00:06:54.650 Calling clear_nbd_subsystem 00:06:54.650 Calling clear_ublk_subsystem 00:06:54.650 Calling clear_vhost_blk_subsystem 00:06:54.650 Calling clear_vhost_scsi_subsystem 00:06:54.650 Calling clear_bdev_subsystem 00:06:54.650 10:02:16 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:54.650 10:02:16 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:54.650 10:02:16 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:54.651 10:02:16 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:54.651 10:02:16 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:54.651 10:02:16 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:55.221 10:02:16 json_config -- json_config/json_config.sh@345 -- # break 00:06:55.222 10:02:16 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:55.222 10:02:16 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:55.222 10:02:16 json_config -- json_config/common.sh@31 -- # local app=target 00:06:55.222 10:02:16 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:55.222 10:02:16 json_config -- json_config/common.sh@35 -- # [[ -n 
913405 ]] 00:06:55.222 10:02:16 json_config -- json_config/common.sh@38 -- # kill -SIGINT 913405 00:06:55.222 10:02:16 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:55.222 10:02:16 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:55.222 10:02:16 json_config -- json_config/common.sh@41 -- # kill -0 913405 00:06:55.222 10:02:16 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:55.482 10:02:17 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:55.482 10:02:17 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:55.482 10:02:17 json_config -- json_config/common.sh@41 -- # kill -0 913405 00:06:55.482 10:02:17 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:55.482 10:02:17 json_config -- json_config/common.sh@43 -- # break 00:06:55.482 10:02:17 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:55.482 10:02:17 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:55.482 SPDK target shutdown done 00:06:55.482 10:02:17 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:55.482 INFO: relaunching applications... 00:06:55.482 10:02:17 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:55.482 10:02:17 json_config -- json_config/common.sh@9 -- # local app=target 00:06:55.482 10:02:17 json_config -- json_config/common.sh@10 -- # shift 00:06:55.482 10:02:17 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:55.482 10:02:17 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:55.482 10:02:17 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:55.482 10:02:17 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:55.482 10:02:17 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:55.482 10:02:17 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=916625 00:06:55.482 10:02:17 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:55.482 Waiting for target to run... 00:06:55.482 10:02:17 json_config -- json_config/common.sh@25 -- # waitforlisten 916625 /var/tmp/spdk_tgt.sock 00:06:55.482 10:02:17 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:55.482 10:02:17 json_config -- common/autotest_common.sh@830 -- # '[' -z 916625 ']' 00:06:55.482 10:02:17 json_config -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:55.482 10:02:17 json_config -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:55.482 10:02:17 json_config -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:55.482 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:55.482 10:02:17 json_config -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:55.482 10:02:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:55.743 [2024-06-10 10:02:17.359799] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
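The shutdown traced above is a plain signal-and-poll loop: common.sh sends SIGINT to the target's pid, then re-checks it with kill -0 up to 30 times, sleeping 0.5s between checks, before declaring "SPDK target shutdown done". A minimal standalone sketch of that pattern (function name and argument handling are illustrative, not lifted from common.sh):

    # Sketch of the signal-and-poll shutdown seen above; pid, retry count and
    # delay are parameters here rather than the hard-coded values in the trace.
    shutdown_and_wait() {
        local pid=$1 retries=${2:-30}
        kill -SIGINT "$pid" 2>/dev/null || return 0    # already gone
        for ((i = 0; i < retries; i++)); do
            kill -0 "$pid" 2>/dev/null || return 0     # target has exited
            sleep 0.5
        done
        echo "process $pid did not exit within $((retries / 2))s" >&2
        return 1
    }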
00:06:55.743 [2024-06-10 10:02:17.359858] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid916625 ] 00:06:56.004 [2024-06-10 10:02:17.782332] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.004 [2024-06-10 10:02:17.838325] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.265 [2024-06-10 10:02:17.892078] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:56.265 [2024-06-10 10:02:17.900114] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:56.265 [2024-06-10 10:02:17.908129] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:56.265 [2024-06-10 10:02:17.988573] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:58.814 [2024-06-10 10:02:20.123680] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:58.814 [2024-06-10 10:02:20.123724] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:58.814 [2024-06-10 10:02:20.123732] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:58.814 [2024-06-10 10:02:20.131698] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:58.814 [2024-06-10 10:02:20.131715] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:58.814 [2024-06-10 10:02:20.139711] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:58.814 [2024-06-10 10:02:20.139726] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:58.814 [2024-06-10 10:02:20.147741] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:58.814 [2024-06-10 10:02:20.147757] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:58.814 [2024-06-10 10:02:20.147764] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:01.359 [2024-06-10 10:02:23.001667] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:01.359 [2024-06-10 10:02:23.001702] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:01.359 [2024-06-10 10:02:23.001712] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f4c950 00:07:01.359 [2024-06-10 10:02:23.001718] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:01.359 [2024-06-10 10:02:23.001959] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:01.359 [2024-06-10 10:02:23.001971] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:07:01.621 10:02:23 json_config -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:01.621 10:02:23 json_config -- common/autotest_common.sh@863 -- # return 0 00:07:01.621 10:02:23 json_config -- json_config/common.sh@26 -- # echo '' 00:07:01.621 00:07:01.621 10:02:23 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:07:01.621 10:02:23 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: 
Checking if target configuration is the same...' 00:07:01.621 INFO: Checking if target configuration is the same... 00:07:01.621 10:02:23 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:07:01.621 10:02:23 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:01.621 10:02:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:01.621 + '[' 2 -ne 2 ']' 00:07:01.621 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:01.621 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:07:01.621 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:01.621 +++ basename /dev/fd/62 00:07:01.621 ++ mktemp /tmp/62.XXX 00:07:01.621 + tmp_file_1=/tmp/62.R1E 00:07:01.621 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:01.621 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:01.621 + tmp_file_2=/tmp/spdk_tgt_config.json.jAO 00:07:01.621 + ret=0 00:07:01.621 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:01.882 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:01.882 + diff -u /tmp/62.R1E /tmp/spdk_tgt_config.json.jAO 00:07:01.882 + echo 'INFO: JSON config files are the same' 00:07:01.882 INFO: JSON config files are the same 00:07:01.882 + rm /tmp/62.R1E /tmp/spdk_tgt_config.json.jAO 00:07:01.882 + exit 0 00:07:01.882 10:02:23 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:07:01.883 10:02:23 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:07:01.883 INFO: changing configuration and checking if this can be detected... 00:07:01.883 10:02:23 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:01.883 10:02:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:02.143 10:02:23 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:02.143 10:02:23 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:07:02.143 10:02:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:02.143 + '[' 2 -ne 2 ']' 00:07:02.143 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:02.143 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
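The "Checking if target configuration is the same..." step above is a sort-then-diff: json_diff.sh pulls the live configuration with save_config over the RPC socket, normalizes both JSON documents with config_filter.py -method sort into temp files, and lets diff -u decide. A rough equivalent using only the scripts and flags visible in the trace, assuming config_filter.py reads the JSON on stdin as the pipeline suggests (variable names are illustrative):

    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
    rpc="$rootdir/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"
    filter=$rootdir/test/json_config/config_filter.py

    live=$(mktemp /tmp/62.XXX)
    saved=$(mktemp /tmp/spdk_tgt_config.json.XXX)

    # Normalize both the running target's config and the saved file, then compare.
    $rpc save_config | $filter -method sort > "$live"
    $filter -method sort < "$rootdir/spdk_tgt_config.json" > "$saved"

    if diff -u "$live" "$saved"; then
        echo 'INFO: JSON config files are the same'
    else
        echo 'INFO: configuration change detected.'
    fi
    rm -f "$live" "$saved"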
00:07:02.143 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:02.143 +++ basename /dev/fd/62 00:07:02.143 ++ mktemp /tmp/62.XXX 00:07:02.143 + tmp_file_1=/tmp/62.E6E 00:07:02.143 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:02.143 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:02.143 + tmp_file_2=/tmp/spdk_tgt_config.json.A9n 00:07:02.143 + ret=0 00:07:02.143 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:02.404 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:02.404 + diff -u /tmp/62.E6E /tmp/spdk_tgt_config.json.A9n 00:07:02.404 + ret=1 00:07:02.404 + echo '=== Start of file: /tmp/62.E6E ===' 00:07:02.404 + cat /tmp/62.E6E 00:07:02.404 + echo '=== End of file: /tmp/62.E6E ===' 00:07:02.404 + echo '' 00:07:02.404 + echo '=== Start of file: /tmp/spdk_tgt_config.json.A9n ===' 00:07:02.404 + cat /tmp/spdk_tgt_config.json.A9n 00:07:02.404 + echo '=== End of file: /tmp/spdk_tgt_config.json.A9n ===' 00:07:02.404 + echo '' 00:07:02.404 + rm /tmp/62.E6E /tmp/spdk_tgt_config.json.A9n 00:07:02.404 + exit 1 00:07:02.404 10:02:24 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:07:02.404 INFO: configuration change detected. 00:07:02.404 10:02:24 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:07:02.404 10:02:24 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:07:02.404 10:02:24 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:07:02.404 10:02:24 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:02.404 10:02:24 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:07:02.404 10:02:24 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:07:02.404 10:02:24 json_config -- json_config/json_config.sh@317 -- # [[ -n 916625 ]] 00:07:02.404 10:02:24 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:07:02.404 10:02:24 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:07:02.404 10:02:24 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:07:02.404 10:02:24 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:02.665 10:02:24 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:07:02.665 10:02:24 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:07:02.665 10:02:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:07:02.665 10:02:24 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:07:02.665 10:02:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:07:02.926 10:02:24 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:07:02.926 10:02:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:07:03.187 10:02:24 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:07:03.187 10:02:24 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:07:03.187 10:02:25 json_config -- json_config/json_config.sh@193 -- # uname -s 00:07:03.187 10:02:25 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:07:03.187 10:02:25 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:07:03.187 10:02:25 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:07:03.187 10:02:25 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:07:03.187 10:02:25 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:07:03.187 10:02:25 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:03.448 10:02:25 json_config -- json_config/json_config.sh@323 -- # killprocess 916625 00:07:03.448 10:02:25 json_config -- common/autotest_common.sh@949 -- # '[' -z 916625 ']' 00:07:03.448 10:02:25 json_config -- common/autotest_common.sh@953 -- # kill -0 916625 00:07:03.448 10:02:25 json_config -- common/autotest_common.sh@954 -- # uname 00:07:03.448 10:02:25 json_config -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:03.448 10:02:25 json_config -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 916625 00:07:03.448 10:02:25 json_config -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:03.448 10:02:25 json_config -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:03.448 10:02:25 json_config -- common/autotest_common.sh@967 -- # echo 'killing process with pid 916625' 00:07:03.448 killing process with pid 916625 00:07:03.448 10:02:25 json_config -- common/autotest_common.sh@968 -- # kill 916625 00:07:03.448 10:02:25 json_config -- common/autotest_common.sh@973 -- # wait 916625 00:07:05.995 10:02:27 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:05.995 10:02:27 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:07:05.995 10:02:27 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:07:05.995 10:02:27 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:05.995 10:02:27 json_config -- json_config/json_config.sh@328 -- # return 0 00:07:05.995 10:02:27 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:07:05.995 INFO: Success 00:07:05.995 00:07:05.995 real 0m28.950s 00:07:05.995 user 0m32.844s 00:07:05.995 sys 0m2.732s 00:07:05.995 10:02:27 json_config -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:05.995 10:02:27 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:05.995 ************************************ 00:07:05.995 END TEST json_config 00:07:05.995 ************************************ 00:07:05.995 10:02:27 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:05.995 10:02:27 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:05.995 10:02:27 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:05.995 10:02:27 -- common/autotest_common.sh@10 -- # set +x 00:07:05.995 ************************************ 00:07:05.995 START TEST json_config_extra_key 00:07:05.995 ************************************ 00:07:05.995 10:02:27 json_config_extra_key -- common/autotest_common.sh@1124 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:06.256 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:06.256 10:02:27 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:06.256 10:02:27 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:06.256 10:02:27 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:06.256 10:02:27 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:06.256 10:02:27 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:06.256 10:02:27 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:06.256 10:02:27 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:06.256 10:02:27 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:06.256 10:02:27 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:06.256 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:06.257 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:06.257 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:07:06.257 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:06.257 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:06.257 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:07:06.257 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:06.257 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:06.257 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:06.257 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:06.257 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:06.257 INFO: launching applications... 
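common.sh, sourced by the extra_key test above, tracks each application in bash associative arrays keyed by app name ('target' here): its pid, RPC socket, extra spdk_tgt parameters, and the JSON config to load. A condensed sketch of how that bookkeeping can drive one generic launcher; the array contents mirror the declarations in the trace, while the start_app helper itself is illustrative:

    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk

    declare -A app_pid=( [target]='' )
    declare -A app_socket=( [target]='/var/tmp/spdk_tgt.sock' )
    declare -A app_params=( [target]='-m 0x1 -s 1024' )
    declare -A configs_path=( [target]="$rootdir/test/json_config/extra_key.json" )

    start_app() {
        local app=$1
        # Same invocation shape as the trace: core mask, memory size, RPC socket, JSON config.
        $rootdir/build/bin/spdk_tgt ${app_params[$app]} \
            -r "${app_socket[$app]}" --json "${configs_path[$app]}" &
        app_pid[$app]=$!
        echo "Waiting for $app to run..."
    }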
00:07:06.257 10:02:27 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:06.257 10:02:27 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:06.257 10:02:27 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:06.257 10:02:27 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:06.257 10:02:27 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:06.257 10:02:27 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:06.257 10:02:27 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:06.257 10:02:27 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:06.257 10:02:27 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=918578 00:07:06.257 10:02:27 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:06.257 Waiting for target to run... 00:07:06.257 10:02:27 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 918578 /var/tmp/spdk_tgt.sock 00:07:06.257 10:02:27 json_config_extra_key -- common/autotest_common.sh@830 -- # '[' -z 918578 ']' 00:07:06.257 10:02:27 json_config_extra_key -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:06.257 10:02:27 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:06.257 10:02:27 json_config_extra_key -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:06.257 10:02:27 json_config_extra_key -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:06.257 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:06.257 10:02:27 json_config_extra_key -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:06.257 10:02:27 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:06.257 [2024-06-10 10:02:27.977749] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:06.257 [2024-06-10 10:02:27.977810] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid918578 ] 00:07:06.518 [2024-06-10 10:02:28.299067] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.518 [2024-06-10 10:02:28.349693] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.089 10:02:28 json_config_extra_key -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:07.089 10:02:28 json_config_extra_key -- common/autotest_common.sh@863 -- # return 0 00:07:07.089 10:02:28 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:07.089 00:07:07.089 10:02:28 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:07.089 INFO: shutting down applications... 
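waitforlisten, called right after the launch above, blocks until the new process both exists and answers on its RPC socket (max_retries=100 in the trace). Without reproducing autotest_common.sh, the same effect can be approximated by polling a cheap RPC until it succeeds; rpc_get_methods and the -t timeout flag used here both appear later in this log:

    # Assumes $rootdir points at the SPDK tree, as elsewhere in this log.
    wait_for_rpc() {
        local pid=$1 sock=${2:-/var/tmp/spdk_tgt.sock} max_retries=${3:-100}
        for ((i = 0; i < max_retries; i++)); do
            kill -0 "$pid" 2>/dev/null || { echo "app exited before listening on $sock" >&2; return 1; }
            if $rootdir/scripts/rpc.py -s "$sock" -t 1 rpc_get_methods >/dev/null 2>&1; then
                return 0    # socket is up and the RPC server is responding
            fi
            sleep 0.5
        done
        return 1
    }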
00:07:07.089 10:02:28 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:07.089 10:02:28 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:07.089 10:02:28 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:07.089 10:02:28 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 918578 ]] 00:07:07.089 10:02:28 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 918578 00:07:07.089 10:02:28 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:07.090 10:02:28 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:07.090 10:02:28 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 918578 00:07:07.090 10:02:28 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:07.661 10:02:29 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:07.662 10:02:29 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:07.662 10:02:29 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 918578 00:07:07.662 10:02:29 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:07.662 10:02:29 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:07.662 10:02:29 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:07.662 10:02:29 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:07.662 SPDK target shutdown done 00:07:07.662 10:02:29 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:07.662 Success 00:07:07.662 00:07:07.662 real 0m1.501s 00:07:07.662 user 0m1.081s 00:07:07.662 sys 0m0.413s 00:07:07.662 10:02:29 json_config_extra_key -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:07.662 10:02:29 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:07.662 ************************************ 00:07:07.662 END TEST json_config_extra_key 00:07:07.662 ************************************ 00:07:07.662 10:02:29 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:07.662 10:02:29 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:07.662 10:02:29 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:07.662 10:02:29 -- common/autotest_common.sh@10 -- # set +x 00:07:07.662 ************************************ 00:07:07.662 START TEST alias_rpc 00:07:07.662 ************************************ 00:07:07.662 10:02:29 alias_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:07.662 * Looking for test storage... 
00:07:07.662 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:07:07.662 10:02:29 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:07.662 10:02:29 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=918927 00:07:07.662 10:02:29 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 918927 00:07:07.662 10:02:29 alias_rpc -- common/autotest_common.sh@830 -- # '[' -z 918927 ']' 00:07:07.662 10:02:29 alias_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:07.662 10:02:29 alias_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:07.662 10:02:29 alias_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:07.662 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:07.662 10:02:29 alias_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:07.662 10:02:29 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:07.662 10:02:29 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:07.923 [2024-06-10 10:02:29.552409] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:07.923 [2024-06-10 10:02:29.552460] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid918927 ] 00:07:07.923 [2024-06-10 10:02:29.641414] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.923 [2024-06-10 10:02:29.706085] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.863 10:02:30 alias_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:08.863 10:02:30 alias_rpc -- common/autotest_common.sh@863 -- # return 0 00:07:08.863 10:02:30 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:08.864 10:02:30 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 918927 00:07:08.864 10:02:30 alias_rpc -- common/autotest_common.sh@949 -- # '[' -z 918927 ']' 00:07:08.864 10:02:30 alias_rpc -- common/autotest_common.sh@953 -- # kill -0 918927 00:07:08.864 10:02:30 alias_rpc -- common/autotest_common.sh@954 -- # uname 00:07:08.864 10:02:30 alias_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:08.864 10:02:30 alias_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 918927 00:07:08.864 10:02:30 alias_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:08.864 10:02:30 alias_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:08.864 10:02:30 alias_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 918927' 00:07:08.864 killing process with pid 918927 00:07:08.864 10:02:30 alias_rpc -- common/autotest_common.sh@968 -- # kill 918927 00:07:08.864 10:02:30 alias_rpc -- common/autotest_common.sh@973 -- # wait 918927 00:07:09.124 00:07:09.124 real 0m1.445s 00:07:09.124 user 0m1.638s 00:07:09.124 sys 0m0.386s 00:07:09.124 10:02:30 alias_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:09.124 10:02:30 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:09.124 ************************************ 00:07:09.124 END TEST alias_rpc 00:07:09.124 
************************************ 00:07:09.124 10:02:30 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:07:09.124 10:02:30 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:09.124 10:02:30 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:09.124 10:02:30 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:09.124 10:02:30 -- common/autotest_common.sh@10 -- # set +x 00:07:09.125 ************************************ 00:07:09.125 START TEST spdkcli_tcp 00:07:09.125 ************************************ 00:07:09.125 10:02:30 spdkcli_tcp -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:09.385 * Looking for test storage... 00:07:09.385 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:07:09.385 10:02:31 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:07:09.385 10:02:31 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:09.385 10:02:31 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:07:09.385 10:02:31 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:09.385 10:02:31 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:09.385 10:02:31 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:09.385 10:02:31 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:09.385 10:02:31 spdkcli_tcp -- common/autotest_common.sh@723 -- # xtrace_disable 00:07:09.385 10:02:31 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:09.385 10:02:31 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=919290 00:07:09.385 10:02:31 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 919290 00:07:09.385 10:02:31 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:09.385 10:02:31 spdkcli_tcp -- common/autotest_common.sh@830 -- # '[' -z 919290 ']' 00:07:09.385 10:02:31 spdkcli_tcp -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:09.385 10:02:31 spdkcli_tcp -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:09.385 10:02:31 spdkcli_tcp -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:09.385 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:09.385 10:02:31 spdkcli_tcp -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:09.385 10:02:31 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:09.385 [2024-06-10 10:02:31.088552] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
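The spdkcli_tcp test that starts here drives the same RPC server over TCP instead of the Unix socket: a socat process bridges 127.0.0.1:9998 to /var/tmp/spdk.sock and rpc.py is pointed at the TCP address, as the rpc_get_methods trace below shows. Stripped of the test plumbing, the bridge amounts to:

    # Bridge the target's Unix-domain RPC socket to TCP port 9998 (single connection),
    # then issue an RPC over TCP with the retry/timeout flags seen in the trace.
    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
    socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
    socat_pid=$!

    $rootdir/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods

    kill "$socat_pid" 2>/dev/null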
00:07:09.385 [2024-06-10 10:02:31.088619] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid919290 ] 00:07:09.385 [2024-06-10 10:02:31.182413] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:09.645 [2024-06-10 10:02:31.251297] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:09.645 [2024-06-10 10:02:31.251302] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.216 10:02:31 spdkcli_tcp -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:10.216 10:02:31 spdkcli_tcp -- common/autotest_common.sh@863 -- # return 0 00:07:10.216 10:02:31 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=919321 00:07:10.216 10:02:31 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:10.216 10:02:31 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:10.216 [ 00:07:10.216 "bdev_malloc_delete", 00:07:10.216 "bdev_malloc_create", 00:07:10.216 "bdev_null_resize", 00:07:10.216 "bdev_null_delete", 00:07:10.216 "bdev_null_create", 00:07:10.216 "bdev_nvme_cuse_unregister", 00:07:10.216 "bdev_nvme_cuse_register", 00:07:10.216 "bdev_opal_new_user", 00:07:10.216 "bdev_opal_set_lock_state", 00:07:10.216 "bdev_opal_delete", 00:07:10.216 "bdev_opal_get_info", 00:07:10.216 "bdev_opal_create", 00:07:10.216 "bdev_nvme_opal_revert", 00:07:10.216 "bdev_nvme_opal_init", 00:07:10.216 "bdev_nvme_send_cmd", 00:07:10.216 "bdev_nvme_get_path_iostat", 00:07:10.216 "bdev_nvme_get_mdns_discovery_info", 00:07:10.216 "bdev_nvme_stop_mdns_discovery", 00:07:10.216 "bdev_nvme_start_mdns_discovery", 00:07:10.216 "bdev_nvme_set_multipath_policy", 00:07:10.216 "bdev_nvme_set_preferred_path", 00:07:10.216 "bdev_nvme_get_io_paths", 00:07:10.216 "bdev_nvme_remove_error_injection", 00:07:10.216 "bdev_nvme_add_error_injection", 00:07:10.216 "bdev_nvme_get_discovery_info", 00:07:10.216 "bdev_nvme_stop_discovery", 00:07:10.216 "bdev_nvme_start_discovery", 00:07:10.216 "bdev_nvme_get_controller_health_info", 00:07:10.216 "bdev_nvme_disable_controller", 00:07:10.216 "bdev_nvme_enable_controller", 00:07:10.216 "bdev_nvme_reset_controller", 00:07:10.216 "bdev_nvme_get_transport_statistics", 00:07:10.216 "bdev_nvme_apply_firmware", 00:07:10.216 "bdev_nvme_detach_controller", 00:07:10.216 "bdev_nvme_get_controllers", 00:07:10.216 "bdev_nvme_attach_controller", 00:07:10.216 "bdev_nvme_set_hotplug", 00:07:10.216 "bdev_nvme_set_options", 00:07:10.216 "bdev_passthru_delete", 00:07:10.216 "bdev_passthru_create", 00:07:10.216 "bdev_lvol_set_parent_bdev", 00:07:10.216 "bdev_lvol_set_parent", 00:07:10.216 "bdev_lvol_check_shallow_copy", 00:07:10.216 "bdev_lvol_start_shallow_copy", 00:07:10.216 "bdev_lvol_grow_lvstore", 00:07:10.216 "bdev_lvol_get_lvols", 00:07:10.216 "bdev_lvol_get_lvstores", 00:07:10.216 "bdev_lvol_delete", 00:07:10.216 "bdev_lvol_set_read_only", 00:07:10.216 "bdev_lvol_resize", 00:07:10.216 "bdev_lvol_decouple_parent", 00:07:10.216 "bdev_lvol_inflate", 00:07:10.216 "bdev_lvol_rename", 00:07:10.216 "bdev_lvol_clone_bdev", 00:07:10.216 "bdev_lvol_clone", 00:07:10.216 "bdev_lvol_snapshot", 00:07:10.216 "bdev_lvol_create", 00:07:10.216 "bdev_lvol_delete_lvstore", 00:07:10.216 "bdev_lvol_rename_lvstore", 00:07:10.216 "bdev_lvol_create_lvstore", 00:07:10.216 
"bdev_raid_set_options", 00:07:10.216 "bdev_raid_remove_base_bdev", 00:07:10.216 "bdev_raid_add_base_bdev", 00:07:10.216 "bdev_raid_delete", 00:07:10.216 "bdev_raid_create", 00:07:10.216 "bdev_raid_get_bdevs", 00:07:10.216 "bdev_error_inject_error", 00:07:10.216 "bdev_error_delete", 00:07:10.216 "bdev_error_create", 00:07:10.216 "bdev_split_delete", 00:07:10.216 "bdev_split_create", 00:07:10.216 "bdev_delay_delete", 00:07:10.216 "bdev_delay_create", 00:07:10.216 "bdev_delay_update_latency", 00:07:10.216 "bdev_zone_block_delete", 00:07:10.216 "bdev_zone_block_create", 00:07:10.216 "blobfs_create", 00:07:10.216 "blobfs_detect", 00:07:10.216 "blobfs_set_cache_size", 00:07:10.216 "bdev_crypto_delete", 00:07:10.216 "bdev_crypto_create", 00:07:10.216 "bdev_compress_delete", 00:07:10.216 "bdev_compress_create", 00:07:10.216 "bdev_compress_get_orphans", 00:07:10.216 "bdev_aio_delete", 00:07:10.216 "bdev_aio_rescan", 00:07:10.216 "bdev_aio_create", 00:07:10.216 "bdev_ftl_set_property", 00:07:10.216 "bdev_ftl_get_properties", 00:07:10.216 "bdev_ftl_get_stats", 00:07:10.216 "bdev_ftl_unmap", 00:07:10.216 "bdev_ftl_unload", 00:07:10.216 "bdev_ftl_delete", 00:07:10.216 "bdev_ftl_load", 00:07:10.216 "bdev_ftl_create", 00:07:10.216 "bdev_virtio_attach_controller", 00:07:10.216 "bdev_virtio_scsi_get_devices", 00:07:10.216 "bdev_virtio_detach_controller", 00:07:10.216 "bdev_virtio_blk_set_hotplug", 00:07:10.216 "bdev_iscsi_delete", 00:07:10.216 "bdev_iscsi_create", 00:07:10.216 "bdev_iscsi_set_options", 00:07:10.216 "accel_error_inject_error", 00:07:10.216 "ioat_scan_accel_module", 00:07:10.216 "dsa_scan_accel_module", 00:07:10.216 "iaa_scan_accel_module", 00:07:10.216 "dpdk_cryptodev_get_driver", 00:07:10.216 "dpdk_cryptodev_set_driver", 00:07:10.216 "dpdk_cryptodev_scan_accel_module", 00:07:10.216 "compressdev_scan_accel_module", 00:07:10.216 "keyring_file_remove_key", 00:07:10.216 "keyring_file_add_key", 00:07:10.216 "keyring_linux_set_options", 00:07:10.216 "iscsi_get_histogram", 00:07:10.216 "iscsi_enable_histogram", 00:07:10.216 "iscsi_set_options", 00:07:10.216 "iscsi_get_auth_groups", 00:07:10.216 "iscsi_auth_group_remove_secret", 00:07:10.216 "iscsi_auth_group_add_secret", 00:07:10.216 "iscsi_delete_auth_group", 00:07:10.216 "iscsi_create_auth_group", 00:07:10.216 "iscsi_set_discovery_auth", 00:07:10.216 "iscsi_get_options", 00:07:10.216 "iscsi_target_node_request_logout", 00:07:10.216 "iscsi_target_node_set_redirect", 00:07:10.216 "iscsi_target_node_set_auth", 00:07:10.216 "iscsi_target_node_add_lun", 00:07:10.216 "iscsi_get_stats", 00:07:10.216 "iscsi_get_connections", 00:07:10.216 "iscsi_portal_group_set_auth", 00:07:10.216 "iscsi_start_portal_group", 00:07:10.216 "iscsi_delete_portal_group", 00:07:10.216 "iscsi_create_portal_group", 00:07:10.216 "iscsi_get_portal_groups", 00:07:10.216 "iscsi_delete_target_node", 00:07:10.216 "iscsi_target_node_remove_pg_ig_maps", 00:07:10.216 "iscsi_target_node_add_pg_ig_maps", 00:07:10.216 "iscsi_create_target_node", 00:07:10.216 "iscsi_get_target_nodes", 00:07:10.216 "iscsi_delete_initiator_group", 00:07:10.216 "iscsi_initiator_group_remove_initiators", 00:07:10.216 "iscsi_initiator_group_add_initiators", 00:07:10.216 "iscsi_create_initiator_group", 00:07:10.216 "iscsi_get_initiator_groups", 00:07:10.216 "nvmf_set_crdt", 00:07:10.216 "nvmf_set_config", 00:07:10.216 "nvmf_set_max_subsystems", 00:07:10.216 "nvmf_stop_mdns_prr", 00:07:10.216 "nvmf_publish_mdns_prr", 00:07:10.216 "nvmf_subsystem_get_listeners", 00:07:10.216 "nvmf_subsystem_get_qpairs", 
00:07:10.216 "nvmf_subsystem_get_controllers", 00:07:10.216 "nvmf_get_stats", 00:07:10.216 "nvmf_get_transports", 00:07:10.216 "nvmf_create_transport", 00:07:10.216 "nvmf_get_targets", 00:07:10.216 "nvmf_delete_target", 00:07:10.216 "nvmf_create_target", 00:07:10.216 "nvmf_subsystem_allow_any_host", 00:07:10.216 "nvmf_subsystem_remove_host", 00:07:10.216 "nvmf_subsystem_add_host", 00:07:10.216 "nvmf_ns_remove_host", 00:07:10.217 "nvmf_ns_add_host", 00:07:10.217 "nvmf_subsystem_remove_ns", 00:07:10.217 "nvmf_subsystem_add_ns", 00:07:10.217 "nvmf_subsystem_listener_set_ana_state", 00:07:10.217 "nvmf_discovery_get_referrals", 00:07:10.217 "nvmf_discovery_remove_referral", 00:07:10.217 "nvmf_discovery_add_referral", 00:07:10.217 "nvmf_subsystem_remove_listener", 00:07:10.217 "nvmf_subsystem_add_listener", 00:07:10.217 "nvmf_delete_subsystem", 00:07:10.217 "nvmf_create_subsystem", 00:07:10.217 "nvmf_get_subsystems", 00:07:10.217 "env_dpdk_get_mem_stats", 00:07:10.217 "nbd_get_disks", 00:07:10.217 "nbd_stop_disk", 00:07:10.217 "nbd_start_disk", 00:07:10.217 "ublk_recover_disk", 00:07:10.217 "ublk_get_disks", 00:07:10.217 "ublk_stop_disk", 00:07:10.217 "ublk_start_disk", 00:07:10.217 "ublk_destroy_target", 00:07:10.217 "ublk_create_target", 00:07:10.217 "virtio_blk_create_transport", 00:07:10.217 "virtio_blk_get_transports", 00:07:10.217 "vhost_controller_set_coalescing", 00:07:10.217 "vhost_get_controllers", 00:07:10.217 "vhost_delete_controller", 00:07:10.217 "vhost_create_blk_controller", 00:07:10.217 "vhost_scsi_controller_remove_target", 00:07:10.217 "vhost_scsi_controller_add_target", 00:07:10.217 "vhost_start_scsi_controller", 00:07:10.217 "vhost_create_scsi_controller", 00:07:10.217 "thread_set_cpumask", 00:07:10.217 "framework_get_scheduler", 00:07:10.217 "framework_set_scheduler", 00:07:10.217 "framework_get_reactors", 00:07:10.217 "thread_get_io_channels", 00:07:10.217 "thread_get_pollers", 00:07:10.217 "thread_get_stats", 00:07:10.217 "framework_monitor_context_switch", 00:07:10.217 "spdk_kill_instance", 00:07:10.217 "log_enable_timestamps", 00:07:10.217 "log_get_flags", 00:07:10.217 "log_clear_flag", 00:07:10.217 "log_set_flag", 00:07:10.217 "log_get_level", 00:07:10.217 "log_set_level", 00:07:10.217 "log_get_print_level", 00:07:10.217 "log_set_print_level", 00:07:10.217 "framework_enable_cpumask_locks", 00:07:10.217 "framework_disable_cpumask_locks", 00:07:10.217 "framework_wait_init", 00:07:10.217 "framework_start_init", 00:07:10.217 "scsi_get_devices", 00:07:10.217 "bdev_get_histogram", 00:07:10.217 "bdev_enable_histogram", 00:07:10.217 "bdev_set_qos_limit", 00:07:10.217 "bdev_set_qd_sampling_period", 00:07:10.217 "bdev_get_bdevs", 00:07:10.217 "bdev_reset_iostat", 00:07:10.217 "bdev_get_iostat", 00:07:10.217 "bdev_examine", 00:07:10.217 "bdev_wait_for_examine", 00:07:10.217 "bdev_set_options", 00:07:10.217 "notify_get_notifications", 00:07:10.217 "notify_get_types", 00:07:10.217 "accel_get_stats", 00:07:10.217 "accel_set_options", 00:07:10.217 "accel_set_driver", 00:07:10.217 "accel_crypto_key_destroy", 00:07:10.217 "accel_crypto_keys_get", 00:07:10.217 "accel_crypto_key_create", 00:07:10.217 "accel_assign_opc", 00:07:10.217 "accel_get_module_info", 00:07:10.217 "accel_get_opc_assignments", 00:07:10.217 "vmd_rescan", 00:07:10.217 "vmd_remove_device", 00:07:10.217 "vmd_enable", 00:07:10.217 "sock_get_default_impl", 00:07:10.217 "sock_set_default_impl", 00:07:10.217 "sock_impl_set_options", 00:07:10.217 "sock_impl_get_options", 00:07:10.217 "iobuf_get_stats", 00:07:10.217 
"iobuf_set_options", 00:07:10.217 "framework_get_pci_devices", 00:07:10.217 "framework_get_config", 00:07:10.217 "framework_get_subsystems", 00:07:10.217 "trace_get_info", 00:07:10.217 "trace_get_tpoint_group_mask", 00:07:10.217 "trace_disable_tpoint_group", 00:07:10.217 "trace_enable_tpoint_group", 00:07:10.217 "trace_clear_tpoint_mask", 00:07:10.217 "trace_set_tpoint_mask", 00:07:10.217 "keyring_get_keys", 00:07:10.217 "spdk_get_version", 00:07:10.217 "rpc_get_methods" 00:07:10.217 ] 00:07:10.476 10:02:32 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@729 -- # xtrace_disable 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:10.476 10:02:32 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:10.476 10:02:32 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 919290 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@949 -- # '[' -z 919290 ']' 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@953 -- # kill -0 919290 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@954 -- # uname 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 919290 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@967 -- # echo 'killing process with pid 919290' 00:07:10.476 killing process with pid 919290 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@968 -- # kill 919290 00:07:10.476 10:02:32 spdkcli_tcp -- common/autotest_common.sh@973 -- # wait 919290 00:07:10.736 00:07:10.736 real 0m1.482s 00:07:10.736 user 0m2.752s 00:07:10.736 sys 0m0.446s 00:07:10.736 10:02:32 spdkcli_tcp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:10.736 10:02:32 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:10.736 ************************************ 00:07:10.736 END TEST spdkcli_tcp 00:07:10.736 ************************************ 00:07:10.736 10:02:32 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:10.736 10:02:32 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:10.736 10:02:32 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:10.736 10:02:32 -- common/autotest_common.sh@10 -- # set +x 00:07:10.736 ************************************ 00:07:10.736 START TEST dpdk_mem_utility 00:07:10.736 ************************************ 00:07:10.736 10:02:32 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:10.736 * Looking for test storage... 
00:07:10.736 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:07:10.736 10:02:32 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:10.736 10:02:32 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=919602 00:07:10.736 10:02:32 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 919602 00:07:10.736 10:02:32 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:10.736 10:02:32 dpdk_mem_utility -- common/autotest_common.sh@830 -- # '[' -z 919602 ']' 00:07:10.736 10:02:32 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:10.736 10:02:32 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:10.736 10:02:32 dpdk_mem_utility -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:10.736 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:10.736 10:02:32 dpdk_mem_utility -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:10.736 10:02:32 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:10.996 [2024-06-10 10:02:32.628497] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:10.996 [2024-06-10 10:02:32.628564] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid919602 ] 00:07:10.996 [2024-06-10 10:02:32.719017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.996 [2024-06-10 10:02:32.786464] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.938 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:11.938 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@863 -- # return 0 00:07:11.938 10:02:33 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:11.938 10:02:33 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:11.938 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:11.938 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:11.938 { 00:07:11.938 "filename": "/tmp/spdk_mem_dump.txt" 00:07:11.938 } 00:07:11.938 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:11.938 10:02:33 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:11.938 DPDK memory size 816.000000 MiB in 2 heap(s) 00:07:11.938 2 heaps totaling size 816.000000 MiB 00:07:11.938 size: 814.000000 MiB heap id: 0 00:07:11.938 size: 2.000000 MiB heap id: 1 00:07:11.938 end heaps---------- 00:07:11.938 8 mempools totaling size 598.116089 MiB 00:07:11.938 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:11.938 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:11.938 size: 84.521057 MiB name: bdev_io_919602 00:07:11.938 size: 51.011292 MiB name: evtpool_919602 00:07:11.938 size: 50.003479 MiB name: msgpool_919602 00:07:11.938 size: 
21.763794 MiB name: PDU_Pool 00:07:11.938 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:11.938 size: 0.026123 MiB name: Session_Pool 00:07:11.938 end mempools------- 00:07:11.938 201 memzones totaling size 4.176453 MiB 00:07:11.938 size: 1.000366 MiB name: RG_ring_0_919602 00:07:11.938 size: 1.000366 MiB name: RG_ring_1_919602 00:07:11.938 size: 1.000366 MiB name: RG_ring_4_919602 00:07:11.938 size: 1.000366 MiB name: RG_ring_5_919602 00:07:11.938 size: 0.125366 MiB name: RG_ring_2_919602 00:07:11.938 size: 0.015991 MiB name: RG_ring_3_919602 00:07:11.938 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:11.938 size: 0.000305 MiB name: 0000:cc:01.0_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:01.1_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:01.2_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:01.3_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:01.4_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:01.5_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:01.6_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:01.7_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:02.0_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:02.1_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:02.2_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:02.3_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:02.4_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:02.5_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:02.6_qat 00:07:11.938 size: 0.000305 MiB name: 0000:cc:02.7_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:01.0_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:01.1_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:01.2_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:01.3_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:01.4_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:01.5_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:01.6_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:01.7_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:02.0_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:02.1_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:02.2_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:02.3_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:02.4_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:02.5_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:02.6_qat 00:07:11.938 size: 0.000305 MiB name: 0000:ce:02.7_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:01.0_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:01.1_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:01.2_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:01.3_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:01.4_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:01.5_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:01.6_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:01.7_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:02.0_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:02.1_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:02.2_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:02.3_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:02.4_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:02.5_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:02.6_qat 00:07:11.938 size: 0.000305 MiB name: 0000:d0:02.7_qat 00:07:11.938 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:11.938 size: 0.000122 MiB name: rte_compressdev_data_0 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_2 
00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:11.938 size: 0.000122 MiB name: rte_compressdev_data_1 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:11.938 size: 0.000122 MiB name: rte_compressdev_data_2 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_7 00:07:11.938 size: 0.000122 MiB name: rte_compressdev_data_3 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:11.938 size: 0.000122 MiB name: rte_compressdev_data_4 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:11.938 size: 0.000122 MiB name: rte_compressdev_data_5 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:11.938 size: 0.000122 MiB name: rte_compressdev_data_6 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:11.938 size: 0.000122 MiB name: rte_compressdev_data_7 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:11.938 size: 0.000122 MiB name: rte_compressdev_data_8 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:11.938 size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_9 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_10 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_11 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_12 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_13 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_29 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_14 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_15 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_16 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_17 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_18 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_19 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:11.939 size: 0.000122 MiB name: 
rte_compressdev_data_20 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_21 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_22 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_23 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_24 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_25 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_26 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_27 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_28 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_29 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_30 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_31 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_64 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_65 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_32 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_66 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_67 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_33 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_68 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_69 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_34 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_70 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_71 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_35 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_72 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_73 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_36 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_74 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_75 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_37 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_76 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_77 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_38 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_78 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_79 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_39 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_80 00:07:11.939 
size: 0.000122 MiB name: rte_cryptodev_data_81 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_40 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_82 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_83 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_41 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_84 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_85 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_42 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_86 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_87 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_43 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_88 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_89 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_44 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_90 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_91 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_45 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_92 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_93 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_46 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_94 00:07:11.939 size: 0.000122 MiB name: rte_cryptodev_data_95 00:07:11.939 size: 0.000122 MiB name: rte_compressdev_data_47 00:07:11.939 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:11.939 end memzones------- 00:07:11.939 10:02:33 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:11.939 heap id: 0 total size: 814.000000 MiB number of busy elements: 494 number of free elements: 14 00:07:11.939 list of free elements. size: 11.842712 MiB 00:07:11.939 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:11.939 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:11.939 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:11.939 element at address: 0x200003e00000 with size: 0.996460 MiB 00:07:11.939 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:11.939 element at address: 0x200007000000 with size: 0.991760 MiB 00:07:11.939 element at address: 0x200013800000 with size: 0.978882 MiB 00:07:11.939 element at address: 0x200019200000 with size: 0.937256 MiB 00:07:11.939 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:07:11.939 element at address: 0x200003a00000 with size: 0.498535 MiB 00:07:11.939 element at address: 0x20000b200000 with size: 0.491272 MiB 00:07:11.939 element at address: 0x200000800000 with size: 0.486145 MiB 00:07:11.939 element at address: 0x200019400000 with size: 0.485840 MiB 00:07:11.939 element at address: 0x200027e00000 with size: 0.399597 MiB 00:07:11.939 list of standard malloc elements. 
size: 199.872437 MiB 00:07:11.939 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:11.939 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:11.939 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:11.939 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:11.939 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:11.939 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:11.939 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:11.939 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:11.939 element at address: 0x20000033b340 with size: 0.004395 MiB 00:07:11.939 element at address: 0x20000033e8c0 with size: 0.004395 MiB 00:07:11.939 element at address: 0x200000341e40 with size: 0.004395 MiB 00:07:11.939 element at address: 0x2000003453c0 with size: 0.004395 MiB 00:07:11.939 element at address: 0x200000348940 with size: 0.004395 MiB 00:07:11.939 element at address: 0x20000034bec0 with size: 0.004395 MiB 00:07:11.939 element at address: 0x20000034f440 with size: 0.004395 MiB 00:07:11.939 element at address: 0x2000003529c0 with size: 0.004395 MiB 00:07:11.939 element at address: 0x200000355f40 with size: 0.004395 MiB 00:07:11.939 element at address: 0x2000003594c0 with size: 0.004395 MiB 00:07:11.939 element at address: 0x20000035ca40 with size: 0.004395 MiB 00:07:11.939 element at address: 0x20000035ffc0 with size: 0.004395 MiB 00:07:11.939 element at address: 0x200000363540 with size: 0.004395 MiB 00:07:11.939 element at address: 0x200000366ac0 with size: 0.004395 MiB 00:07:11.939 element at address: 0x20000036a040 with size: 0.004395 MiB 00:07:11.939 element at address: 0x20000036d5c0 with size: 0.004395 MiB 00:07:11.939 element at address: 0x200000370b40 with size: 0.004395 MiB 00:07:11.939 element at address: 0x2000003740c0 with size: 0.004395 MiB 00:07:11.939 element at address: 0x200000377640 with size: 0.004395 MiB 00:07:11.939 element at address: 0x20000037abc0 with size: 0.004395 MiB 00:07:11.939 element at address: 0x20000037e140 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003816c0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x200000384c40 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003881c0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x20000038b740 with size: 0.004395 MiB 00:07:11.940 element at address: 0x20000038ecc0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x200000392240 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003957c0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x200000398d40 with size: 0.004395 MiB 00:07:11.940 element at address: 0x20000039c2c0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x20000039f840 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003a2dc0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003a6340 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003a98c0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003ace40 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003b03c0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003b3940 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003b6ec0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003ba440 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003bd9c0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003c0f40 with size: 0.004395 MiB 
00:07:11.940 element at address: 0x2000003c44c0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003c7a40 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003cafc0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003ce540 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003d1ac0 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003d5040 with size: 0.004395 MiB 00:07:11.940 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:07:11.940 element at address: 0x200000339240 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000033a2c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000033c7c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000033d840 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000033fd40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000340dc0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003432c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000344340 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000346840 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003478c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000349dc0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000034ae40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000034d340 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000034e3c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003508c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000351940 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000353e40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000354ec0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003573c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000358440 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000035a940 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000035b9c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000035dec0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000035ef40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000361440 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003624c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003649c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000365a40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000367f40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000368fc0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000036b4c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000036c540 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000036ea40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000036fac0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000371fc0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000373040 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000375540 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003765c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000378ac0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000379b40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000037c040 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000037d0c0 with size: 0.004028 MiB 00:07:11.940 element at 
address: 0x20000037f5c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000380640 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000382b40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000383bc0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003860c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000387140 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000389640 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000038a6c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000038cbc0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000038dc40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000390140 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003911c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003936c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000394740 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000396c40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000397cc0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000039a1c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000039b240 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000039d740 with size: 0.004028 MiB 00:07:11.940 element at address: 0x20000039e7c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003a0cc0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003a1d40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003a4240 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003a52c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003a77c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003a8840 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003aad40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003abdc0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003ae2c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003af340 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003b1840 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003b28c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003b4dc0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003b5e40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003b8340 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003b93c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003bb8c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003bc940 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003bee40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003bfec0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003c23c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003c3440 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003c5940 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003c69c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003c8ec0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003c9f40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003cc440 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003cd4c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003cf9c0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003d0a40 
with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003d2f40 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003d3fc0 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:07:11.940 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:07:11.940 element at address: 0x200000200000 with size: 0.000305 MiB 00:07:11.940 element at address: 0x20000020ea00 with size: 0.000305 MiB 00:07:11.940 element at address: 0x200000200140 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200200 with size: 0.000183 MiB 00:07:11.940 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200380 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200440 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200500 with size: 0.000183 MiB 00:07:11.940 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200680 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200740 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200800 with size: 0.000183 MiB 00:07:11.940 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200980 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200a40 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200b00 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200bc0 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200c80 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200d40 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000200e00 with size: 0.000183 MiB 00:07:11.940 element at address: 0x2000002090c0 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000209180 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000209240 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000209300 with size: 0.000183 MiB 00:07:11.940 element at address: 0x2000002093c0 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000209480 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000209540 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000209600 with size: 0.000183 MiB 00:07:11.940 element at address: 0x2000002096c0 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000209780 with size: 0.000183 MiB 00:07:11.940 element at address: 0x200000209840 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000209900 with size: 0.000183 MiB 00:07:11.941 element at address: 0x2000002099c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000209a80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000209b40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000209c00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000209cc0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000209d80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000209e40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000209f00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000209fc0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a080 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a140 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a200 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a2c0 with size: 0.000183 MiB 
00:07:11.941 element at address: 0x20000020a380 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a440 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a500 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a5c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a680 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a740 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a800 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a8c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020a980 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020aa40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ab00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020abc0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ac80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ad40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ae00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020aec0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020af80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b040 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b100 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b1c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b280 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b340 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b400 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b4c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b580 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b640 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b700 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b7c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b880 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020b940 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ba00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020bac0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020bb80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020bc40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020bd00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020bdc0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020be80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020bf40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c000 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c0c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c180 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c240 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c300 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c3c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c480 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c540 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c600 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c6c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c780 with size: 0.000183 MiB 00:07:11.941 element at 
address: 0x20000020c840 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c900 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020c9c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ca80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020cb40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020cc00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ccc0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020cd80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ce40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020cf00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020cfc0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d080 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d140 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d200 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d2c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d380 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d440 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d500 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d5c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d680 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d740 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d800 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d8c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020d980 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020da40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020db00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020dbc0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020dc80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020dd40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020de00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020dec0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020df80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e040 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e100 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e1c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e280 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e340 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e400 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e4c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e580 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e640 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e700 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e7c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e880 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020e940 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020eb40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ec00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ecc0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ed80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ee40 
with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ef00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020efc0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f080 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f140 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f200 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f2c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f380 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f440 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f500 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f5c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f680 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f740 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f800 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f8c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020f980 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020fa40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020fb00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020fbc0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020fc80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020fd40 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020fe00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020fec0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x20000020ff80 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210040 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210100 with size: 0.000183 MiB 00:07:11.941 element at address: 0x2000002101c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210280 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210340 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210400 with size: 0.000183 MiB 00:07:11.941 element at address: 0x2000002104c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210580 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210640 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210700 with size: 0.000183 MiB 00:07:11.941 element at address: 0x2000002107c0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210880 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210940 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210a00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000210c00 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000214ec0 with size: 0.000183 MiB 00:07:11.941 element at address: 0x200000235180 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235240 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235300 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000002353c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235480 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235540 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235600 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000002356c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235780 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235840 with size: 0.000183 MiB 
00:07:11.942 element at address: 0x200000235900 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000002359c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235a80 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235b40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235c00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235cc0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235d80 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235e40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000235f00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236100 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000002361c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236280 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236340 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236400 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000002364c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236580 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236640 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236700 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000002367c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236880 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236940 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236a00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236ac0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236b80 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236c40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000236d00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000338f00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000338fc0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000033c540 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000033fac0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000343040 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003465c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000349b40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000034d0c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000350640 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000353bc0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000357140 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000035a6c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000035dc40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003611c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000364740 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000367cc0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000036b240 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000036e7c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000371d40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003752c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000378840 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000037bdc0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000037f340 with size: 0.000183 MiB 00:07:11.942 element at 
address: 0x2000003828c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000385e40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003893c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000038c940 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000038fec0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000393440 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003969c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200000399f40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000039d4c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003a0a40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003a3fc0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003a7540 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003aaac0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003ae040 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003b4b40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003b80c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003bb640 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003bebc0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003c2140 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003c56c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003c8c40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003cc1c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003cf740 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003d2cc0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000003d6840 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000087c740 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000087c800 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000087c980 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e664c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e66580 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6d180 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6d380 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6d980 
with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:07:11.942 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:07:11.943 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:07:11.943 element at address: 0x200027e6fe40 with size: 0.000183 MiB 
00:07:11.943 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:07:11.943 list of memzone associated elements. size: 602.284851 MiB 00:07:11.943 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:07:11.943 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:11.943 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:07:11.943 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:11.943 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:07:11.943 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_919602_0 00:07:11.943 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:07:11.943 associated memzone info: size: 48.002930 MiB name: MP_evtpool_919602_0 00:07:11.943 element at address: 0x200003fff380 with size: 48.003052 MiB 00:07:11.943 associated memzone info: size: 48.002930 MiB name: MP_msgpool_919602_0 00:07:11.943 element at address: 0x2000195be940 with size: 20.255554 MiB 00:07:11.943 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:11.943 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:07:11.943 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:11.943 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:07:11.943 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_919602 00:07:11.943 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:07:11.943 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_919602 00:07:11.943 element at address: 0x200000236dc0 with size: 1.008118 MiB 00:07:11.943 associated memzone info: size: 1.007996 MiB name: MP_evtpool_919602 00:07:11.943 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:07:11.943 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:11.943 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:07:11.943 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:11.943 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:07:11.943 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:11.943 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:07:11.943 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:07:11.943 element at address: 0x200003eff180 with size: 1.000488 MiB 00:07:11.943 associated memzone info: size: 1.000366 MiB name: RG_ring_0_919602 00:07:11.943 element at address: 0x200003affc00 with size: 1.000488 MiB 00:07:11.943 associated memzone info: size: 1.000366 MiB name: RG_ring_1_919602 00:07:11.943 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:07:11.943 associated memzone info: size: 1.000366 MiB name: RG_ring_4_919602 00:07:11.943 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:07:11.943 associated memzone info: size: 1.000366 MiB name: RG_ring_5_919602 00:07:11.943 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:07:11.943 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_919602 00:07:11.943 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:07:11.943 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:11.943 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:07:11.943 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:11.943 element at address: 0x20001947c600 with size: 0.250488 MiB 00:07:11.943 associated memzone info: size: 0.250366 MiB name: 
RG_MP_PDU_immediate_data_Pool 00:07:11.943 element at address: 0x200000214f80 with size: 0.125488 MiB 00:07:11.943 associated memzone info: size: 0.125366 MiB name: RG_ring_2_919602 00:07:11.943 element at address: 0x200000200ec0 with size: 0.031738 MiB 00:07:11.943 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:11.943 element at address: 0x200027e66640 with size: 0.023743 MiB 00:07:11.943 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:11.943 element at address: 0x200000210cc0 with size: 0.016113 MiB 00:07:11.943 associated memzone info: size: 0.015991 MiB name: RG_ring_3_919602 00:07:11.943 element at address: 0x200027e6c780 with size: 0.002441 MiB 00:07:11.943 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:11.943 element at address: 0x2000003d6300 with size: 0.001282 MiB 00:07:11.943 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:11.943 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.0_qat 00:07:11.943 element at address: 0x2000003d2d80 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.1_qat 00:07:11.943 element at address: 0x2000003cf800 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.2_qat 00:07:11.943 element at address: 0x2000003cc280 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.3_qat 00:07:11.943 element at address: 0x2000003c8d00 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.4_qat 00:07:11.943 element at address: 0x2000003c5780 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.5_qat 00:07:11.943 element at address: 0x2000003c2200 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.6_qat 00:07:11.943 element at address: 0x2000003bec80 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.7_qat 00:07:11.943 element at address: 0x2000003bb700 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.0_qat 00:07:11.943 element at address: 0x2000003b8180 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.1_qat 00:07:11.943 element at address: 0x2000003b4c00 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.2_qat 00:07:11.943 element at address: 0x2000003b1680 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.3_qat 00:07:11.943 element at address: 0x2000003ae100 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.4_qat 00:07:11.943 element at address: 0x2000003aab80 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.5_qat 00:07:11.943 element at address: 0x2000003a7600 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.6_qat 00:07:11.943 element at address: 0x2000003a4080 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.7_qat 00:07:11.943 element at address: 0x2000003a0b00 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.0_qat 00:07:11.943 
element at address: 0x20000039d580 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.1_qat 00:07:11.943 element at address: 0x20000039a000 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.2_qat 00:07:11.943 element at address: 0x200000396a80 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.3_qat 00:07:11.943 element at address: 0x200000393500 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.4_qat 00:07:11.943 element at address: 0x20000038ff80 with size: 0.000427 MiB 00:07:11.943 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.5_qat 00:07:11.943 element at address: 0x20000038ca00 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.6_qat 00:07:11.944 element at address: 0x200000389480 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.7_qat 00:07:11.944 element at address: 0x200000385f00 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.0_qat 00:07:11.944 element at address: 0x200000382980 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.1_qat 00:07:11.944 element at address: 0x20000037f400 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.2_qat 00:07:11.944 element at address: 0x20000037be80 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.3_qat 00:07:11.944 element at address: 0x200000378900 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.4_qat 00:07:11.944 element at address: 0x200000375380 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.5_qat 00:07:11.944 element at address: 0x200000371e00 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.6_qat 00:07:11.944 element at address: 0x20000036e880 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.7_qat 00:07:11.944 element at address: 0x20000036b300 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.0_qat 00:07:11.944 element at address: 0x200000367d80 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.1_qat 00:07:11.944 element at address: 0x200000364800 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.2_qat 00:07:11.944 element at address: 0x200000361280 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.3_qat 00:07:11.944 element at address: 0x20000035dd00 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.4_qat 00:07:11.944 element at address: 0x20000035a780 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.5_qat 00:07:11.944 element at address: 0x200000357200 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.6_qat 00:07:11.944 element at address: 0x200000353c80 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.7_qat 00:07:11.944 element at address: 0x200000350700 with size: 0.000427 MiB 
00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.0_qat 00:07:11.944 element at address: 0x20000034d180 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.1_qat 00:07:11.944 element at address: 0x200000349c00 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.2_qat 00:07:11.944 element at address: 0x200000346680 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.3_qat 00:07:11.944 element at address: 0x200000343100 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.4_qat 00:07:11.944 element at address: 0x20000033fb80 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.5_qat 00:07:11.944 element at address: 0x20000033c600 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.6_qat 00:07:11.944 element at address: 0x200000339080 with size: 0.000427 MiB 00:07:11.944 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.7_qat 00:07:11.944 element at address: 0x2000003d6900 with size: 0.000305 MiB 00:07:11.944 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:11.944 element at address: 0x200000235fc0 with size: 0.000305 MiB 00:07:11.944 associated memzone info: size: 0.000183 MiB name: MP_msgpool_919602 00:07:11.944 element at address: 0x200000210ac0 with size: 0.000305 MiB 00:07:11.944 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_919602 00:07:11.944 element at address: 0x200027e6d240 with size: 0.000305 MiB 00:07:11.944 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:11.944 element at address: 0x2000003d6240 with size: 0.000183 MiB 00:07:11.944 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:11.944 10:02:33 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:11.944 10:02:33 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 919602 00:07:11.944 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@949 -- # '[' -z 919602 ']' 00:07:11.944 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@953 -- # kill -0 919602 00:07:11.944 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@954 -- # uname 00:07:11.944 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:11.944 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 919602 00:07:11.944 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:11.944 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:11.944 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@967 -- # echo 'killing process with pid 919602' 00:07:11.944 killing process with pid 919602 00:07:11.944 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@968 -- # kill 919602 00:07:11.944 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@973 -- # wait 919602 00:07:12.205 00:07:12.205 real 0m1.432s 00:07:12.205 user 0m1.622s 00:07:12.205 sys 0m0.403s 00:07:12.205 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:12.205 10:02:33 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:12.205 ************************************ 00:07:12.205 END TEST dpdk_mem_utility 00:07:12.205 
************************************ 00:07:12.205 10:02:33 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:12.205 10:02:33 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:12.205 10:02:33 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:12.205 10:02:33 -- common/autotest_common.sh@10 -- # set +x 00:07:12.205 ************************************ 00:07:12.205 START TEST event 00:07:12.205 ************************************ 00:07:12.205 10:02:33 event -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:12.205 * Looking for test storage... 00:07:12.465 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:07:12.465 10:02:34 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:12.465 10:02:34 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:12.465 10:02:34 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:12.465 10:02:34 event -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:07:12.465 10:02:34 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:12.465 10:02:34 event -- common/autotest_common.sh@10 -- # set +x 00:07:12.465 ************************************ 00:07:12.465 START TEST event_perf 00:07:12.465 ************************************ 00:07:12.465 10:02:34 event.event_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:12.465 Running I/O for 1 seconds...[2024-06-10 10:02:34.138693] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:12.466 [2024-06-10 10:02:34.138790] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid919828 ] 00:07:12.466 [2024-06-10 10:02:34.234676] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:12.466 [2024-06-10 10:02:34.312915] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:12.466 [2024-06-10 10:02:34.313067] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:07:12.466 [2024-06-10 10:02:34.313214] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.466 Running I/O for 1 seconds...[2024-06-10 10:02:34.313214] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:07:13.848 00:07:13.848 lcore 0: 177755 00:07:13.848 lcore 1: 177756 00:07:13.848 lcore 2: 177752 00:07:13.848 lcore 3: 177752 00:07:13.848 done. 
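The per-lcore counts just above are the output of the event_perf benchmark: it spins an event-dispatch loop on every core in the -m 0xF mask for the -t 1 second run and then reports how many events each reactor processed; the real/user/sys timing that follows below belongs to the same test. A minimal sketch of the invocation, assuming a built SPDK tree under $SPDK_DIR (the variable is an assumption of this sketch; the CI run above uses its own Jenkins workspace path):

```bash
#!/usr/bin/env bash
# Minimal sketch, assuming SPDK has been built under $SPDK_DIR.
# (The CI log above runs the same binary from its Jenkins workspace.)
SPDK_DIR=${SPDK_DIR:-$HOME/spdk}

# Drive event_perf on 4 cores (mask 0xF) for 1 second; it prints one
# "lcore N: <event count>" line per reactor, as seen above.
sudo "$SPDK_DIR/test/event/event_perf/event_perf" -m 0xF -t 1
```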
00:07:13.848 00:07:13.848 real 0m1.254s 00:07:13.848 user 0m4.154s 00:07:13.848 sys 0m0.096s 00:07:13.848 10:02:35 event.event_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:13.848 10:02:35 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:13.848 ************************************ 00:07:13.848 END TEST event_perf 00:07:13.848 ************************************ 00:07:13.848 10:02:35 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:13.848 10:02:35 event -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:07:13.848 10:02:35 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:13.848 10:02:35 event -- common/autotest_common.sh@10 -- # set +x 00:07:13.848 ************************************ 00:07:13.848 START TEST event_reactor 00:07:13.848 ************************************ 00:07:13.848 10:02:35 event.event_reactor -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:13.848 [2024-06-10 10:02:35.461865] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:13.848 [2024-06-10 10:02:35.461924] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid920061 ] 00:07:13.848 [2024-06-10 10:02:35.552903] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.848 [2024-06-10 10:02:35.617966] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.230 test_start 00:07:15.230 oneshot 00:07:15.230 tick 100 00:07:15.230 tick 100 00:07:15.230 tick 250 00:07:15.230 tick 100 00:07:15.230 tick 100 00:07:15.230 tick 250 00:07:15.230 tick 100 00:07:15.230 tick 500 00:07:15.230 tick 100 00:07:15.230 tick 100 00:07:15.230 tick 250 00:07:15.230 tick 100 00:07:15.230 tick 100 00:07:15.230 test_end 00:07:15.230 00:07:15.230 real 0m1.231s 00:07:15.230 user 0m1.125s 00:07:15.230 sys 0m0.101s 00:07:15.230 10:02:36 event.event_reactor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:15.230 10:02:36 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:15.230 ************************************ 00:07:15.230 END TEST event_reactor 00:07:15.230 ************************************ 00:07:15.230 10:02:36 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:15.230 10:02:36 event -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:07:15.230 10:02:36 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:15.230 10:02:36 event -- common/autotest_common.sh@10 -- # set +x 00:07:15.230 ************************************ 00:07:15.230 START TEST event_reactor_perf 00:07:15.230 ************************************ 00:07:15.230 10:02:36 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:15.230 [2024-06-10 10:02:36.752619] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:07:15.230 [2024-06-10 10:02:36.752665] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid920380 ] 00:07:15.230 [2024-06-10 10:02:36.840959] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.230 [2024-06-10 10:02:36.903171] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.274 test_start 00:07:16.274 test_end 00:07:16.275 Performance: 398741 events per second 00:07:16.275 00:07:16.275 real 0m1.215s 00:07:16.275 user 0m1.130s 00:07:16.275 sys 0m0.081s 00:07:16.275 10:02:37 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:16.275 10:02:37 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:16.275 ************************************ 00:07:16.275 END TEST event_reactor_perf 00:07:16.275 ************************************ 00:07:16.275 10:02:37 event -- event/event.sh@49 -- # uname -s 00:07:16.275 10:02:37 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:16.275 10:02:37 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:16.275 10:02:37 event -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:16.275 10:02:37 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:16.275 10:02:37 event -- common/autotest_common.sh@10 -- # set +x 00:07:16.275 ************************************ 00:07:16.275 START TEST event_scheduler 00:07:16.275 ************************************ 00:07:16.275 10:02:38 event.event_scheduler -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:16.275 * Looking for test storage... 00:07:16.275 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:07:16.275 10:02:38 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:16.275 10:02:38 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=920733 00:07:16.275 10:02:38 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:16.275 10:02:38 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:16.275 10:02:38 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 920733 00:07:16.275 10:02:38 event.event_scheduler -- common/autotest_common.sh@830 -- # '[' -z 920733 ']' 00:07:16.275 10:02:38 event.event_scheduler -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:16.275 10:02:38 event.event_scheduler -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:16.275 10:02:38 event.event_scheduler -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:16.275 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:16.275 10:02:38 event.event_scheduler -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:16.275 10:02:38 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:16.536 [2024-06-10 10:02:38.192801] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:07:16.536 [2024-06-10 10:02:38.192873] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid920733 ] 00:07:16.536 [2024-06-10 10:02:38.286839] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:16.536 [2024-06-10 10:02:38.379710] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.536 [2024-06-10 10:02:38.379850] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:16.536 [2024-06-10 10:02:38.379954] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:07:16.536 [2024-06-10 10:02:38.379955] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:07:17.479 10:02:39 event.event_scheduler -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:17.479 10:02:39 event.event_scheduler -- common/autotest_common.sh@863 -- # return 0 00:07:17.479 10:02:39 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:17.479 10:02:39 event.event_scheduler -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.479 10:02:39 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:17.479 POWER: Env isn't set yet! 00:07:17.479 POWER: Attempting to initialise ACPI cpufreq power management... 00:07:17.479 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:07:17.479 POWER: Cannot set governor of lcore 0 to userspace 00:07:17.479 POWER: Attempting to initialise PSTAT power management... 00:07:17.479 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:07:17.479 POWER: Initialized successfully for lcore 0 power management 00:07:17.479 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:07:17.479 POWER: Initialized successfully for lcore 1 power management 00:07:17.479 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:07:17.479 POWER: Initialized successfully for lcore 2 power management 00:07:17.479 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:07:17.479 POWER: Initialized successfully for lcore 3 power management 00:07:17.479 [2024-06-10 10:02:39.081024] scheduler_dynamic.c: 382:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:17.479 [2024-06-10 10:02:39.081036] scheduler_dynamic.c: 384:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:17.479 [2024-06-10 10:02:39.081041] scheduler_dynamic.c: 386:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:17.479 10:02:39 event.event_scheduler -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.479 10:02:39 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:17.479 10:02:39 event.event_scheduler -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.479 10:02:39 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:17.479 [2024-06-10 10:02:39.153011] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
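At this point the scheduler test app has been brought up with --wait-for-rpc and then configured entirely over RPC: each reactor core's governor is switched to 'performance', the dynamic scheduler is selected (with the load-limit/core-limit/core-busy thresholds echoed in the NOTICE lines above), and framework initialization is started. A sketch of the same two RPCs against a running SPDK application, assuming the default /var/tmp/spdk.sock socket (this CI run goes through the autotest rpc_cmd wrapper instead):

```bash
#!/usr/bin/env bash
# Sketch of the RPC sequence shown above, assuming an SPDK app started with
# --wait-for-rpc and listening on the default RPC socket.
SPDK_DIR=${SPDK_DIR:-$HOME/spdk}
RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk.sock"

$RPC framework_set_scheduler dynamic   # pick the dynamic scheduler before init
$RPC framework_start_init              # leave --wait-for-rpc mode and finish framework initialization
```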
00:07:17.479 10:02:39 event.event_scheduler -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.479 10:02:39 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:17.479 10:02:39 event.event_scheduler -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:17.479 10:02:39 event.event_scheduler -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:17.479 10:02:39 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:17.479 ************************************ 00:07:17.479 START TEST scheduler_create_thread 00:07:17.479 ************************************ 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # scheduler_create_thread 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.479 2 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.479 3 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.479 4 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.479 5 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.479 6 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.479 7 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.479 8 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:17.479 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.480 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.480 9 00:07:17.480 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.480 10:02:39 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:17.480 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.480 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.052 10 00:07:18.052 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:18.052 10:02:39 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:18.052 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:18.052 10:02:39 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.437 10:02:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:19.437 10:02:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:19.437 10:02:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:19.437 10:02:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:19.437 10:02:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:20.379 10:02:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:20.379 10:02:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:20.379 10:02:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:20.379 10:02:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:20.951 10:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:20.951 10:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:20.951 10:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:20.951 10:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:20.951 10:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:21.892 10:02:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:21.892 00:07:21.892 real 0m4.225s 00:07:21.892 user 0m0.026s 00:07:21.893 sys 0m0.006s 00:07:21.893 10:02:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:21.893 10:02:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:21.893 ************************************ 00:07:21.893 END TEST scheduler_create_thread 00:07:21.893 ************************************ 00:07:21.893 10:02:43 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:21.893 10:02:43 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 920733 00:07:21.893 10:02:43 event.event_scheduler -- common/autotest_common.sh@949 -- # '[' -z 920733 ']' 00:07:21.893 10:02:43 event.event_scheduler -- common/autotest_common.sh@953 -- # kill -0 920733 00:07:21.893 10:02:43 event.event_scheduler -- common/autotest_common.sh@954 -- # uname 00:07:21.893 10:02:43 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:21.893 10:02:43 event.event_scheduler -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 920733 00:07:21.893 10:02:43 event.event_scheduler -- common/autotest_common.sh@955 -- # process_name=reactor_2 00:07:21.893 10:02:43 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' reactor_2 = sudo ']' 00:07:21.893 10:02:43 event.event_scheduler -- common/autotest_common.sh@967 -- # echo 'killing process with pid 920733' 00:07:21.893 killing process with pid 920733 00:07:21.893 10:02:43 event.event_scheduler -- common/autotest_common.sh@968 -- # kill 920733 00:07:21.893 10:02:43 event.event_scheduler -- common/autotest_common.sh@973 -- # wait 920733 00:07:22.153 [2024-06-10 10:02:43.795065] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
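The scheduler_create_thread sub-test that just finished drives everything through an RPC plugin shipped with the test: it creates pinned busy and idle threads with different core masks and activity percentages, creates an unpinned thread and raises its activity at runtime, and creates one more thread only to delete it again. A condensed sketch of those calls, assuming the autotest environment is already sourced (rpc_cmd is the autotest wrapper around scripts/rpc.py, and scheduler_plugin with its scheduler_thread_* methods comes from SPDK's test/event/scheduler tooling, not the core RPC set):

```bash
# Condensed sketch of the plugin RPCs exercised above; assumes rpc_cmd and the
# scheduler_plugin search path are already set up by the autotest framework.
rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100  # busy thread pinned to core 0
rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0      # idle thread pinned to core 0
half=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)     # unpinned thread, starts idle
rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active "$half" 50                  # raise it to 50% activity
tmp=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)        # short-lived thread ...
rpc_cmd --plugin scheduler_plugin scheduler_thread_delete "$tmp"                          # ... removed again right away
```

The gaps between the timestamps above (10:02:39 through 10:02:43) account for the 0m4.225s wall time reported for the sub-test.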
00:07:22.153 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:07:22.153 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:07:22.153 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:07:22.153 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:07:22.153 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:07:22.153 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:07:22.153 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:07:22.153 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:07:22.153 00:07:22.154 real 0m5.935s 00:07:22.154 user 0m13.896s 00:07:22.154 sys 0m0.411s 00:07:22.154 10:02:43 event.event_scheduler -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:22.154 10:02:43 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:22.154 ************************************ 00:07:22.154 END TEST event_scheduler 00:07:22.154 ************************************ 00:07:22.154 10:02:44 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:22.154 10:02:44 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:22.154 10:02:44 event -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:22.154 10:02:44 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:22.154 10:02:44 event -- common/autotest_common.sh@10 -- # set +x 00:07:22.414 ************************************ 00:07:22.414 START TEST app_repeat 00:07:22.414 ************************************ 00:07:22.414 10:02:44 event.app_repeat -- common/autotest_common.sh@1124 -- # app_repeat_test 00:07:22.414 10:02:44 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:22.414 10:02:44 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:22.414 10:02:44 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:22.414 10:02:44 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:22.415 10:02:44 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:22.415 10:02:44 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:22.415 10:02:44 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:22.415 10:02:44 event.app_repeat -- event/event.sh@19 -- # repeat_pid=921699 00:07:22.415 10:02:44 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:22.415 10:02:44 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:22.415 10:02:44 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 921699' 00:07:22.415 Process app_repeat pid: 921699 00:07:22.415 10:02:44 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:22.415 10:02:44 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:22.415 spdk_app_start Round 0 00:07:22.415 10:02:44 event.app_repeat -- event/event.sh@25 -- # waitforlisten 921699 /var/tmp/spdk-nbd.sock 00:07:22.415 10:02:44 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 921699 ']' 00:07:22.415 10:02:44 event.app_repeat -- 
common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:22.415 10:02:44 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:22.415 10:02:44 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:22.415 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:22.415 10:02:44 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:22.415 10:02:44 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:22.415 [2024-06-10 10:02:44.096250] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:22.415 [2024-06-10 10:02:44.096310] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid921699 ] 00:07:22.415 [2024-06-10 10:02:44.185165] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:22.415 [2024-06-10 10:02:44.249816] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.415 [2024-06-10 10:02:44.249826] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.354 10:02:44 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:23.354 10:02:44 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:07:23.354 10:02:44 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.354 Malloc0 00:07:23.354 10:02:45 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.615 Malloc1 00:07:23.615 10:02:45 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.615 10:02:45 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:23.875 /dev/nbd0 00:07:23.875 10:02:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:23.875 10:02:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:23.875 1+0 records in 00:07:23.875 1+0 records out 00:07:23.875 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265326 s, 15.4 MB/s 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:23.875 10:02:45 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:07:23.875 10:02:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.875 10:02:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.875 10:02:45 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:24.136 /dev/nbd1 00:07:24.136 10:02:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:24.136 10:02:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:24.136 1+0 records in 00:07:24.136 1+0 records out 00:07:24.136 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277608 s, 14.8 MB/s 00:07:24.136 10:02:45 event.app_repeat -- 
common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:24.136 10:02:45 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:07:24.136 10:02:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:24.136 10:02:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:24.136 10:02:45 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:24.136 10:02:45 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.136 10:02:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:24.136 10:02:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:24.136 { 00:07:24.136 "nbd_device": "/dev/nbd0", 00:07:24.136 "bdev_name": "Malloc0" 00:07:24.136 }, 00:07:24.136 { 00:07:24.136 "nbd_device": "/dev/nbd1", 00:07:24.136 "bdev_name": "Malloc1" 00:07:24.136 } 00:07:24.136 ]' 00:07:24.136 10:02:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:24.136 { 00:07:24.136 "nbd_device": "/dev/nbd0", 00:07:24.137 "bdev_name": "Malloc0" 00:07:24.137 }, 00:07:24.137 { 00:07:24.137 "nbd_device": "/dev/nbd1", 00:07:24.137 "bdev_name": "Malloc1" 00:07:24.137 } 00:07:24.137 ]' 00:07:24.137 10:02:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:24.397 /dev/nbd1' 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:24.397 /dev/nbd1' 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:24.397 256+0 records in 00:07:24.397 256+0 records out 00:07:24.397 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125091 s, 83.8 MB/s 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:24.397 256+0 records in 00:07:24.397 256+0 records out 00:07:24.397 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0144694 s, 72.5 MB/s 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:24.397 256+0 records in 00:07:24.397 256+0 records out 00:07:24.397 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0219314 s, 47.8 MB/s 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.397 10:02:46 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:24.657 
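The grep/break/return sequence just above is the waitfornbd_exit helper confirming that /dev/nbd0 has disappeared from /proc/partitions after nbd_stop_disk; its counterpart waitfornbd did the reverse check (plus a 1-block O_DIRECT read probe) right after each nbd_start_disk earlier in the round. A simplified sketch of the two helpers (the real versions live in test/common/autotest_common.sh; the retry delay here is an assumption of this sketch):

```bash
# Simplified sketch of the wait helpers traced above. The real versions in
# test/common/autotest_common.sh also probe the new device with a 1-block
# direct-I/O dd; the 0.1 s retry delay here is an assumption of this sketch.
waitfornbd() {        # wait until /dev/$1 shows up after nbd_start_disk
    local i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$1" /proc/partitions && return 0
        sleep 0.1
    done
    return 1
}

waitfornbd_exit() {   # wait until /dev/$1 is gone after nbd_stop_disk
    local i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$1" /proc/partitions || return 0
        sleep 0.1
    done
    return 1
}
```

They are called with the bare device name, e.g. waitfornbd nbd0 and waitfornbd_exit nbd1, matching the names in the trace.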
10:02:46 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.657 10:02:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:24.918 10:02:46 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:24.918 10:02:46 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:25.179 10:02:46 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:25.440 [2024-06-10 10:02:47.097202] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:25.440 [2024-06-10 10:02:47.159920] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:25.440 [2024-06-10 10:02:47.159924] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.440 [2024-06-10 10:02:47.190583] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:25.440 [2024-06-10 10:02:47.190618] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
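That closes Round 0 of app_repeat. Each round follows the same NBD round-trip traced above: two 64 MB malloc bdevs (4096-byte blocks) are created and exported as /dev/nbd0 and /dev/nbd1, a 1 MiB random seed file is written onto each device with dd and compared back with cmp, and the round tears down with nbd_stop_disk before the app is signalled and restarted for the next round. A sketch of one such round driven by hand, assuming an SPDK app already listening on /var/tmp/spdk-nbd.sock (the seed-file path is a placeholder; the CI keeps its scratch files inside the test tree):

```bash
#!/usr/bin/env bash
# Sketch of one app_repeat-style round, assuming a running SPDK app on the
# /var/tmp/spdk-nbd.sock RPC socket. SEED is a placeholder path.
set -e
SPDK_DIR=${SPDK_DIR:-$HOME/spdk}
RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
SEED=/tmp/nbdrandtest

# Two 64 MB malloc bdevs with 4096-byte blocks, exported over NBD.
$RPC bdev_malloc_create 64 4096            # the RPC prints the bdev name, e.g. Malloc0
$RPC bdev_malloc_create 64 4096            # e.g. Malloc1
$RPC nbd_start_disk Malloc0 /dev/nbd0
$RPC nbd_start_disk Malloc1 /dev/nbd1

# Write 1 MiB of random data to each device and verify it reads back intact.
dd if=/dev/urandom of="$SEED" bs=4096 count=256
for nbd in /dev/nbd0 /dev/nbd1; do
    dd if="$SEED" of="$nbd" bs=4096 count=256 oflag=direct
    cmp -b -n 1M "$SEED" "$nbd"            # any mismatch fails the round
done
rm "$SEED"

# Tear down: detach the NBD devices and signal the app; in the CI test the
# app_repeat binary then restarts its app instance for the next round.
$RPC nbd_stop_disk /dev/nbd0
$RPC nbd_stop_disk /dev/nbd1
$RPC spdk_kill_instance SIGTERM
```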
00:07:28.740 10:02:49 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:28.740 10:02:49 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:28.740 spdk_app_start Round 1 00:07:28.740 10:02:49 event.app_repeat -- event/event.sh@25 -- # waitforlisten 921699 /var/tmp/spdk-nbd.sock 00:07:28.740 10:02:49 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 921699 ']' 00:07:28.740 10:02:49 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:28.740 10:02:49 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:28.740 10:02:49 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:28.740 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:28.740 10:02:49 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:28.740 10:02:49 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:28.740 10:02:50 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:28.740 10:02:50 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:07:28.740 10:02:50 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:28.740 Malloc0 00:07:28.740 10:02:50 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:28.740 Malloc1 00:07:28.740 10:02:50 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:28.740 10:02:50 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:29.002 /dev/nbd0 00:07:29.002 10:02:50 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:29.002 10:02:50 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 
00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:29.002 1+0 records in 00:07:29.002 1+0 records out 00:07:29.002 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024147 s, 17.0 MB/s 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:29.002 10:02:50 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:07:29.002 10:02:50 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.002 10:02:50 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.002 10:02:50 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:29.263 /dev/nbd1 00:07:29.263 10:02:50 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:29.263 10:02:50 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:29.263 1+0 records in 00:07:29.263 1+0 records out 00:07:29.263 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269316 s, 15.2 MB/s 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:07:29.263 10:02:50 event.app_repeat -- common/autotest_common.sh@886 
-- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.263 10:02:51 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:29.263 10:02:51 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:07:29.263 10:02:51 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.263 10:02:51 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.263 10:02:51 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:29.263 10:02:51 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.263 10:02:51 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:29.525 { 00:07:29.525 "nbd_device": "/dev/nbd0", 00:07:29.525 "bdev_name": "Malloc0" 00:07:29.525 }, 00:07:29.525 { 00:07:29.525 "nbd_device": "/dev/nbd1", 00:07:29.525 "bdev_name": "Malloc1" 00:07:29.525 } 00:07:29.525 ]' 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:29.525 { 00:07:29.525 "nbd_device": "/dev/nbd0", 00:07:29.525 "bdev_name": "Malloc0" 00:07:29.525 }, 00:07:29.525 { 00:07:29.525 "nbd_device": "/dev/nbd1", 00:07:29.525 "bdev_name": "Malloc1" 00:07:29.525 } 00:07:29.525 ]' 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:29.525 /dev/nbd1' 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:29.525 /dev/nbd1' 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:29.525 256+0 records in 00:07:29.525 256+0 records out 00:07:29.525 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0120556 s, 87.0 MB/s 00:07:29.525 10:02:51 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:29.526 256+0 records in 00:07:29.526 256+0 records out 00:07:29.526 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0244527 s, 42.9 MB/s 00:07:29.526 
10:02:51 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:29.526 256+0 records in 00:07:29.526 256+0 records out 00:07:29.526 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0163659 s, 64.1 MB/s 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.526 10:02:51 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:29.786 10:02:51 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:29.786 10:02:51 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:29.786 10:02:51 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:29.786 10:02:51 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.786 10:02:51 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.786 10:02:51 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:29.786 10:02:51 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:29.786 10:02:51 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.786 10:02:51 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.786 10:02:51 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 
00:07:30.047 10:02:51 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:30.047 10:02:51 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:30.047 10:02:51 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:30.047 10:02:51 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.047 10:02:51 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.047 10:02:51 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:30.047 10:02:51 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:30.047 10:02:51 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.047 10:02:51 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:30.047 10:02:51 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.047 10:02:51 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:30.307 10:02:51 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:30.307 10:02:51 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:30.568 10:02:52 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:30.568 [2024-06-10 10:02:52.318234] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:30.568 [2024-06-10 10:02:52.380370] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:30.568 [2024-06-10 10:02:52.380374] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.568 [2024-06-10 10:02:52.411650] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:30.568 [2024-06-10 10:02:52.411687] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
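The Round 1 pass above is the complete NBD round trip the app_repeat test leans on: two malloc bdevs (Malloc0, Malloc1) are created over the /var/tmp/spdk-nbd.sock RPC socket, exported as /dev/nbd0 and /dev/nbd1, filled with 1 MiB of random data, read back and compared with cmp, then detached before the app is told to exit with spdk_kill_instance SIGTERM. A minimal standalone sketch of the same flow is below; it assumes an SPDK app is already listening on that socket, reuses the rpc.py path and the 64 / 4096 arguments from the calls above, and substitutes an arbitrary scratch file for the harness's nbdrandtest path:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-nbd.sock
  $rpc -s $sock bdev_malloc_create 64 4096                   # prints the new bdev's name (Malloc0 in the run above)
  $rpc -s $sock nbd_start_disk Malloc0 /dev/nbd0             # export the bdev as an NBD block device
  dd if=/dev/urandom of=/tmp/nbd_pattern bs=4096 count=256   # 1 MiB reference pattern (scratch path is illustrative)
  dd if=/tmp/nbd_pattern of=/dev/nbd0 bs=4096 count=256 oflag=direct
  cmp -b -n 1M /tmp/nbd_pattern /dev/nbd0                    # verify the data reads back unchanged
  $rpc -s $sock nbd_stop_disk /dev/nbd0
  $rpc -s $sock spdk_kill_instance SIGTERM                   # shut the app down, as at the end of each round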
00:07:33.865 10:02:55 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:33.865 10:02:55 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:33.865 spdk_app_start Round 2 00:07:33.865 10:02:55 event.app_repeat -- event/event.sh@25 -- # waitforlisten 921699 /var/tmp/spdk-nbd.sock 00:07:33.865 10:02:55 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 921699 ']' 00:07:33.865 10:02:55 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:33.865 10:02:55 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:33.865 10:02:55 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:33.865 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:33.865 10:02:55 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:33.865 10:02:55 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:33.865 10:02:55 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:33.865 10:02:55 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:07:33.865 10:02:55 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:33.865 Malloc0 00:07:33.865 10:02:55 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:34.125 Malloc1 00:07:34.125 10:02:55 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:34.125 /dev/nbd0 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 
00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:34.125 1+0 records in 00:07:34.125 1+0 records out 00:07:34.125 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294732 s, 13.9 MB/s 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:34.125 10:02:55 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:34.125 10:02:55 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:34.386 /dev/nbd1 00:07:34.386 10:02:56 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:34.386 10:02:56 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:34.386 1+0 records in 00:07:34.386 1+0 records out 00:07:34.386 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268951 s, 15.2 MB/s 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@886 
-- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:34.386 10:02:56 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:07:34.386 10:02:56 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.386 10:02:56 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:34.386 10:02:56 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:34.386 10:02:56 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.386 10:02:56 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:34.646 { 00:07:34.646 "nbd_device": "/dev/nbd0", 00:07:34.646 "bdev_name": "Malloc0" 00:07:34.646 }, 00:07:34.646 { 00:07:34.646 "nbd_device": "/dev/nbd1", 00:07:34.646 "bdev_name": "Malloc1" 00:07:34.646 } 00:07:34.646 ]' 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:34.646 { 00:07:34.646 "nbd_device": "/dev/nbd0", 00:07:34.646 "bdev_name": "Malloc0" 00:07:34.646 }, 00:07:34.646 { 00:07:34.646 "nbd_device": "/dev/nbd1", 00:07:34.646 "bdev_name": "Malloc1" 00:07:34.646 } 00:07:34.646 ]' 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:34.646 /dev/nbd1' 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:34.646 /dev/nbd1' 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:34.646 256+0 records in 00:07:34.646 256+0 records out 00:07:34.646 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011587 s, 90.5 MB/s 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:34.646 256+0 records in 00:07:34.646 256+0 records out 00:07:34.646 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0147118 s, 71.3 MB/s 00:07:34.646 
10:02:56 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:34.646 10:02:56 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:34.905 256+0 records in 00:07:34.906 256+0 records out 00:07:34.906 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0163143 s, 64.3 MB/s 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:34.906 10:02:56 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 
00:07:35.166 10:02:56 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:35.166 10:02:56 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:35.166 10:02:56 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:35.166 10:02:56 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:35.166 10:02:56 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:35.166 10:02:56 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:35.166 10:02:56 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:35.166 10:02:56 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:35.166 10:02:56 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:35.166 10:02:56 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:35.166 10:02:56 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:35.427 10:02:57 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:35.427 10:02:57 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:35.688 10:02:57 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:35.689 [2024-06-10 10:02:57.514465] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:35.949 [2024-06-10 10:02:57.576491] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:35.949 [2024-06-10 10:02:57.576496] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.949 [2024-06-10 10:02:57.606896] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:35.949 [2024-06-10 10:02:57.606932] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:39.256 10:03:00 event.app_repeat -- event/event.sh@38 -- # waitforlisten 921699 /var/tmp/spdk-nbd.sock 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 921699 ']' 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:07:39.256 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:07:39.256 10:03:00 event.app_repeat -- event/event.sh@39 -- # killprocess 921699 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@949 -- # '[' -z 921699 ']' 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@953 -- # kill -0 921699 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@954 -- # uname 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 921699 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 921699' 00:07:39.256 killing process with pid 921699 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@968 -- # kill 921699 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@973 -- # wait 921699 00:07:39.256 spdk_app_start is called in Round 0. 00:07:39.256 Shutdown signal received, stop current app iteration 00:07:39.256 Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 reinitialization... 00:07:39.256 spdk_app_start is called in Round 1. 00:07:39.256 Shutdown signal received, stop current app iteration 00:07:39.256 Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 reinitialization... 00:07:39.256 spdk_app_start is called in Round 2. 00:07:39.256 Shutdown signal received, stop current app iteration 00:07:39.256 Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 reinitialization... 00:07:39.256 spdk_app_start is called in Round 3. 
00:07:39.256 Shutdown signal received, stop current app iteration 00:07:39.256 10:03:00 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:39.256 10:03:00 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:39.256 00:07:39.256 real 0m16.684s 00:07:39.256 user 0m36.707s 00:07:39.256 sys 0m2.415s 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:39.256 10:03:00 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:39.256 ************************************ 00:07:39.256 END TEST app_repeat 00:07:39.256 ************************************ 00:07:39.256 10:03:00 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:39.256 00:07:39.256 real 0m26.809s 00:07:39.256 user 0m57.205s 00:07:39.256 sys 0m3.429s 00:07:39.256 10:03:00 event -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:39.256 10:03:00 event -- common/autotest_common.sh@10 -- # set +x 00:07:39.256 ************************************ 00:07:39.256 END TEST event 00:07:39.256 ************************************ 00:07:39.256 10:03:00 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:39.256 10:03:00 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:39.256 10:03:00 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:39.256 10:03:00 -- common/autotest_common.sh@10 -- # set +x 00:07:39.256 ************************************ 00:07:39.256 START TEST thread 00:07:39.257 ************************************ 00:07:39.257 10:03:00 thread -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:39.257 * Looking for test storage... 00:07:39.257 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:39.257 10:03:00 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:39.257 10:03:00 thread -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:07:39.257 10:03:00 thread -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:39.257 10:03:00 thread -- common/autotest_common.sh@10 -- # set +x 00:07:39.257 ************************************ 00:07:39.257 START TEST thread_poller_perf 00:07:39.257 ************************************ 00:07:39.257 10:03:00 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:39.257 [2024-06-10 10:03:01.005935] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:39.257 [2024-06-10 10:03:01.005998] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid924768 ] 00:07:39.257 [2024-06-10 10:03:01.080074] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.517 [2024-06-10 10:03:01.142886] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.518 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:40.473 ====================================== 00:07:40.473 busy:2609477426 (cyc) 00:07:40.473 total_run_count: 311000 00:07:40.473 tsc_hz: 2600000000 (cyc) 00:07:40.473 ====================================== 00:07:40.473 poller_cost: 8390 (cyc), 3226 (nsec) 00:07:40.473 00:07:40.473 real 0m1.223s 00:07:40.473 user 0m1.137s 00:07:40.473 sys 0m0.078s 00:07:40.473 10:03:02 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:40.473 10:03:02 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:40.473 ************************************ 00:07:40.473 END TEST thread_poller_perf 00:07:40.473 ************************************ 00:07:40.473 10:03:02 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:40.473 10:03:02 thread -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:07:40.473 10:03:02 thread -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:40.473 10:03:02 thread -- common/autotest_common.sh@10 -- # set +x 00:07:40.473 ************************************ 00:07:40.473 START TEST thread_poller_perf 00:07:40.473 ************************************ 00:07:40.473 10:03:02 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:40.473 [2024-06-10 10:03:02.308224] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:40.474 [2024-06-10 10:03:02.308292] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid925154 ] 00:07:40.738 [2024-06-10 10:03:02.398121] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.738 [2024-06-10 10:03:02.464545] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.738 Running 1000 pollers for 1 seconds with 0 microseconds period. 
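The first poller_perf summary above is consistent with poller_cost being the busy cycle count divided by the number of poller invocations, converted to nanoseconds with the reported TSC rate: 2609477426 / 311000 is about 8390 cycles per poll, and 8390 cycles at 2.6 GHz is about 3226 ns, matching the printed 8390 (cyc), 3226 (nsec). A quick shell check of that arithmetic (the field interpretation is inferred from the numbers in this run, not taken from the tool's source):

  busy=2609477426; runs=311000; tsc_hz=2600000000
  awk -v b="$busy" -v r="$runs" -v hz="$tsc_hz" \
      'BEGIN { cyc = int(b / r); printf "poller_cost: %d (cyc), %d (nsec)\n", cyc, int(cyc * 1e9 / hz) }'
  # prints: poller_cost: 8390 (cyc), 3226 (nsec)

The second run with a 0 microsecond period (below) follows the same relationship: 2602106690 / 4083000 is about 637 cycles per poll, or roughly 245 ns at the same TSC rate.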
00:07:41.677 ====================================== 00:07:41.677 busy:2602106690 (cyc) 00:07:41.677 total_run_count: 4083000 00:07:41.677 tsc_hz: 2600000000 (cyc) 00:07:41.677 ====================================== 00:07:41.677 poller_cost: 637 (cyc), 245 (nsec) 00:07:41.677 00:07:41.677 real 0m1.237s 00:07:41.677 user 0m1.138s 00:07:41.677 sys 0m0.095s 00:07:41.677 10:03:03 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:41.677 10:03:03 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:41.677 ************************************ 00:07:41.677 END TEST thread_poller_perf 00:07:41.677 ************************************ 00:07:41.937 10:03:03 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:07:41.937 00:07:41.937 real 0m2.710s 00:07:41.937 user 0m2.368s 00:07:41.937 sys 0m0.346s 00:07:41.937 10:03:03 thread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:41.937 10:03:03 thread -- common/autotest_common.sh@10 -- # set +x 00:07:41.937 ************************************ 00:07:41.937 END TEST thread 00:07:41.937 ************************************ 00:07:41.937 10:03:03 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:41.937 10:03:03 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:41.937 10:03:03 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:41.937 10:03:03 -- common/autotest_common.sh@10 -- # set +x 00:07:41.937 ************************************ 00:07:41.937 START TEST accel 00:07:41.937 ************************************ 00:07:41.937 10:03:03 accel -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:41.937 * Looking for test storage... 00:07:41.938 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:41.938 10:03:03 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:41.938 10:03:03 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:07:41.938 10:03:03 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:41.938 10:03:03 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=925515 00:07:41.938 10:03:03 accel -- accel/accel.sh@63 -- # waitforlisten 925515 00:07:41.938 10:03:03 accel -- common/autotest_common.sh@830 -- # '[' -z 925515 ']' 00:07:41.938 10:03:03 accel -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:41.938 10:03:03 accel -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:41.938 10:03:03 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:41.938 10:03:03 accel -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:41.938 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:41.938 10:03:03 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:41.938 10:03:03 accel -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:41.938 10:03:03 accel -- common/autotest_common.sh@10 -- # set +x 00:07:41.938 10:03:03 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:41.938 10:03:03 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:41.938 10:03:03 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.938 10:03:03 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.938 10:03:03 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:41.938 10:03:03 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:41.938 10:03:03 accel -- accel/accel.sh@41 -- # jq -r . 00:07:41.938 [2024-06-10 10:03:03.787692] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:41.938 [2024-06-10 10:03:03.787757] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid925515 ] 00:07:42.197 [2024-06-10 10:03:03.881352] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.197 [2024-06-10 10:03:03.948816] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.879 10:03:04 accel -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:42.879 10:03:04 accel -- common/autotest_common.sh@863 -- # return 0 00:07:42.879 10:03:04 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:42.879 10:03:04 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:42.879 10:03:04 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:42.879 10:03:04 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:07:42.879 10:03:04 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:42.879 10:03:04 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:42.879 10:03:04 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:42.879 10:03:04 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:42.879 10:03:04 accel -- common/autotest_common.sh@10 -- # set +x 00:07:42.879 10:03:04 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 
10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.879 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.879 10:03:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:42.879 10:03:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:42.880 10:03:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:42.880 10:03:04 accel -- accel/accel.sh@75 -- # killprocess 925515 00:07:42.880 10:03:04 accel -- common/autotest_common.sh@949 -- # '[' -z 925515 ']' 00:07:42.880 10:03:04 accel -- common/autotest_common.sh@953 -- # kill -0 925515 00:07:42.880 10:03:04 accel -- common/autotest_common.sh@954 -- # uname 00:07:42.880 10:03:04 accel -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:42.880 10:03:04 accel -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 925515 00:07:42.880 10:03:04 accel -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:42.880 10:03:04 accel -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:42.880 10:03:04 accel -- common/autotest_common.sh@967 -- # echo 'killing process with pid 925515' 00:07:42.880 killing process with pid 925515 00:07:42.880 10:03:04 accel -- common/autotest_common.sh@968 -- # kill 925515 00:07:42.880 10:03:04 accel -- common/autotest_common.sh@973 -- # wait 925515 00:07:43.140 10:03:04 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:43.140 10:03:04 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:07:43.140 10:03:04 accel -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:07:43.140 10:03:04 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:43.140 10:03:04 accel -- common/autotest_common.sh@10 -- # set +x 00:07:43.140 10:03:04 accel.accel_help -- common/autotest_common.sh@1124 -- # accel_perf -h 00:07:43.140 10:03:04 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:43.140 10:03:04 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:07:43.140 10:03:04 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:43.140 10:03:04 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:43.140 10:03:04 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:43.140 10:03:04 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:43.140 10:03:04 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:43.140 10:03:04 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:07:43.140 10:03:04 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:07:43.140 10:03:04 accel.accel_help -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:43.140 10:03:04 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:07:43.400 10:03:05 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:43.400 10:03:05 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:07:43.400 10:03:05 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:43.400 10:03:05 accel -- common/autotest_common.sh@10 -- # set +x 00:07:43.400 ************************************ 00:07:43.400 START TEST accel_missing_filename 00:07:43.400 ************************************ 00:07:43.400 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w compress 00:07:43.400 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@649 -- # local es=0 00:07:43.400 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:43.400 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:07:43.400 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:43.400 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # type -t accel_perf 00:07:43.400 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:43.400 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w compress 00:07:43.400 10:03:05 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:43.400 10:03:05 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:07:43.400 10:03:05 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:43.400 10:03:05 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:43.400 10:03:05 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:43.400 10:03:05 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:43.400 10:03:05 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:43.400 10:03:05 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:07:43.400 10:03:05 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:07:43.400 [2024-06-10 10:03:05.106525] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:43.400 [2024-06-10 10:03:05.106591] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid925688 ] 00:07:43.400 [2024-06-10 10:03:05.199982] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.662 [2024-06-10 10:03:05.275482] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.662 [2024-06-10 10:03:05.319350] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:43.662 [2024-06-10 10:03:05.356132] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:43.662 A filename is required. 
00:07:43.662 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # es=234 00:07:43.662 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:07:43.662 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # es=106 00:07:43.662 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@662 -- # case "$es" in 00:07:43.662 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@669 -- # es=1 00:07:43.662 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:07:43.662 00:07:43.662 real 0m0.334s 00:07:43.662 user 0m0.225s 00:07:43.662 sys 0m0.137s 00:07:43.662 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:43.662 10:03:05 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:07:43.662 ************************************ 00:07:43.662 END TEST accel_missing_filename 00:07:43.662 ************************************ 00:07:43.662 10:03:05 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:43.662 10:03:05 accel -- common/autotest_common.sh@1100 -- # '[' 10 -le 1 ']' 00:07:43.662 10:03:05 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:43.662 10:03:05 accel -- common/autotest_common.sh@10 -- # set +x 00:07:43.662 ************************************ 00:07:43.662 START TEST accel_compress_verify 00:07:43.662 ************************************ 00:07:43.662 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:43.662 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@649 -- # local es=0 00:07:43.662 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:43.662 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:07:43.662 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:43.662 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # type -t accel_perf 00:07:43.662 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:43.662 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:43.662 10:03:05 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:43.662 10:03:05 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:43.662 10:03:05 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:43.662 10:03:05 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:43.662 10:03:05 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:43.662 10:03:05 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:43.662 10:03:05 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:43.662 10:03:05 
accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:43.662 10:03:05 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:07:43.662 [2024-06-10 10:03:05.505891] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:43.662 [2024-06-10 10:03:05.505950] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid925878 ] 00:07:43.921 [2024-06-10 10:03:05.596813] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.921 [2024-06-10 10:03:05.668334] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.921 [2024-06-10 10:03:05.708218] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:43.921 [2024-06-10 10:03:05.744815] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:44.182 00:07:44.182 Compression does not support the verify option, aborting. 00:07:44.182 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # es=161 00:07:44.182 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:07:44.182 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # es=33 00:07:44.182 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@662 -- # case "$es" in 00:07:44.182 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@669 -- # es=1 00:07:44.182 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:07:44.182 00:07:44.182 real 0m0.322s 00:07:44.182 user 0m0.229s 00:07:44.182 sys 0m0.122s 00:07:44.182 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:44.182 10:03:05 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:07:44.182 ************************************ 00:07:44.182 END TEST accel_compress_verify 00:07:44.182 ************************************ 00:07:44.182 10:03:05 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:07:44.182 10:03:05 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:07:44.182 10:03:05 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:44.182 10:03:05 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.182 ************************************ 00:07:44.182 START TEST accel_wrong_workload 00:07:44.182 ************************************ 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w foobar 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@649 -- # local es=0 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # type -t accel_perf 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w foobar 00:07:44.182 10:03:05 accel.accel_wrong_workload -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:44.182 10:03:05 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:07:44.182 10:03:05 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.182 10:03:05 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.182 10:03:05 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.182 10:03:05 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.182 10:03:05 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.182 10:03:05 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:07:44.182 10:03:05 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:07:44.182 Unsupported workload type: foobar 00:07:44.182 [2024-06-10 10:03:05.893229] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:44.182 accel_perf options: 00:07:44.182 [-h help message] 00:07:44.182 [-q queue depth per core] 00:07:44.182 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:44.182 [-T number of threads per core 00:07:44.182 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:44.182 [-t time in seconds] 00:07:44.182 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:44.182 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:44.182 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:44.182 [-l for compress/decompress workloads, name of uncompressed input file 00:07:44.182 [-S for crc32c workload, use this seed value (default 0) 00:07:44.182 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:44.182 [-f for fill workload, use this BYTE value (default 255) 00:07:44.182 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:44.182 [-y verify result if this switch is on] 00:07:44.182 [-a tasks to allocate per core (default: same value as -q)] 00:07:44.182 Can be used to spread operations across a wider range of memory. 
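The usage dump above is the expected outcome of this negative test: accel_wrong_workload runs accel_perf through the NOT helper from autotest_common.sh with an unsupported '-w foobar', spdk_app_parse_args rejects it, and the helper records the non-zero exit status (the es=... lines) so that the expected failure counts as a pass for run_test. A minimal sketch of the same contrast outside the harness, reusing only options listed in the usage text above (the crc32c arguments are borrowed from the accel_crc32c run later in this log):
# assumes the accel_perf binary built in this workspace, as traced above
PERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf
$PERF -t 1 -w crc32c -S 32 -y      # valid workload: run for 1 second, seed 32, verify results
$PERF -t 1 -w foobar || echo "rejected: unsupported workload type, as this test expects"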
00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # es=1 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:07:44.182 00:07:44.182 real 0m0.036s 00:07:44.182 user 0m0.026s 00:07:44.182 sys 0m0.009s 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:44.182 10:03:05 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:07:44.182 ************************************ 00:07:44.182 END TEST accel_wrong_workload 00:07:44.182 ************************************ 00:07:44.182 Error: writing output failed: Broken pipe 00:07:44.182 10:03:05 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:44.182 10:03:05 accel -- common/autotest_common.sh@1100 -- # '[' 10 -le 1 ']' 00:07:44.182 10:03:05 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:44.182 10:03:05 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.182 ************************************ 00:07:44.182 START TEST accel_negative_buffers 00:07:44.182 ************************************ 00:07:44.182 10:03:05 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:44.182 10:03:05 accel.accel_negative_buffers -- common/autotest_common.sh@649 -- # local es=0 00:07:44.182 10:03:05 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:44.182 10:03:05 accel.accel_negative_buffers -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:07:44.182 10:03:05 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:44.182 10:03:05 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # type -t accel_perf 00:07:44.182 10:03:05 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:44.182 10:03:05 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w xor -y -x -1 00:07:44.182 10:03:05 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:07:44.182 10:03:05 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:07:44.182 10:03:05 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.182 10:03:05 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.182 10:03:05 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.182 10:03:05 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.182 10:03:05 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.182 10:03:05 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:07:44.182 10:03:05 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:07:44.182 -x option must be non-negative. 
00:07:44.182 [2024-06-10 10:03:06.001751] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:44.183 accel_perf options: 00:07:44.183 [-h help message] 00:07:44.183 [-q queue depth per core] 00:07:44.183 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:44.183 [-T number of threads per core 00:07:44.183 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:44.183 [-t time in seconds] 00:07:44.183 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:44.183 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:44.183 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:44.183 [-l for compress/decompress workloads, name of uncompressed input file 00:07:44.183 [-S for crc32c workload, use this seed value (default 0) 00:07:44.183 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:44.183 [-f for fill workload, use this BYTE value (default 255) 00:07:44.183 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:44.183 [-y verify result if this switch is on] 00:07:44.183 [-a tasks to allocate per core (default: same value as -q)] 00:07:44.183 Can be used to spread operations across a wider range of memory. 00:07:44.183 10:03:06 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # es=1 00:07:44.183 10:03:06 accel.accel_negative_buffers -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:07:44.183 10:03:06 accel.accel_negative_buffers -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:07:44.183 10:03:06 accel.accel_negative_buffers -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:07:44.183 00:07:44.183 real 0m0.038s 00:07:44.183 user 0m0.023s 00:07:44.183 sys 0m0.015s 00:07:44.183 10:03:06 accel.accel_negative_buffers -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:44.183 10:03:06 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:07:44.183 ************************************ 00:07:44.183 END TEST accel_negative_buffers 00:07:44.183 ************************************ 00:07:44.183 Error: writing output failed: Broken pipe 00:07:44.183 10:03:06 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:44.183 10:03:06 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:07:44.183 10:03:06 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:44.183 10:03:06 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.443 ************************************ 00:07:44.443 START TEST accel_crc32c 00:07:44.443 ************************************ 00:07:44.443 10:03:06 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
crc32c -S 32 -y 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:44.443 10:03:06 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:44.443 [2024-06-10 10:03:06.110612] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:44.443 [2024-06-10 10:03:06.110688] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid925949 ] 00:07:44.443 [2024-06-10 10:03:06.200288] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.443 [2024-06-10 10:03:06.276370] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 
accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.704 10:03:06 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.645 10:03:07 
accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:45.645 10:03:07 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:45.645 00:07:45.645 real 0m1.338s 00:07:45.645 user 0m0.005s 00:07:45.645 sys 0m0.000s 00:07:45.645 10:03:07 accel.accel_crc32c -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:45.645 10:03:07 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:45.645 ************************************ 00:07:45.645 END TEST accel_crc32c 00:07:45.645 ************************************ 00:07:45.645 10:03:07 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:45.645 10:03:07 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:07:45.645 10:03:07 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:45.645 10:03:07 accel -- common/autotest_common.sh@10 -- # set +x 00:07:45.645 ************************************ 00:07:45.645 START TEST accel_crc32c_C2 00:07:45.645 ************************************ 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:45.645 10:03:07 
accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:45.645 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:45.906 [2024-06-10 10:03:07.517655] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:45.906 [2024-06-10 10:03:07.517712] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid926267 ] 00:07:45.906 [2024-06-10 10:03:07.607510] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.906 [2024-06-10 10:03:07.679282] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 
accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:45.906 10:03:07 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 
-- # IFS=: 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:47.290 00:07:47.290 real 0m1.325s 00:07:47.290 user 0m0.005s 00:07:47.290 sys 0m0.001s 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:47.290 10:03:08 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:47.290 ************************************ 00:07:47.290 END TEST accel_crc32c_C2 00:07:47.290 ************************************ 00:07:47.290 10:03:08 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:47.290 10:03:08 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:07:47.290 10:03:08 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:47.290 10:03:08 accel -- common/autotest_common.sh@10 -- # set +x 00:07:47.290 ************************************ 00:07:47.290 START TEST accel_copy 00:07:47.290 ************************************ 00:07:47.290 10:03:08 accel.accel_copy -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy -y 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
copy -y 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:47.290 10:03:08 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:07:47.290 [2024-06-10 10:03:08.910059] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:47.290 [2024-06-10 10:03:08.910126] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid926519 ] 00:07:47.290 [2024-06-10 10:03:08.996464] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.290 [2024-06-10 10:03:09.062572] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.290 10:03:09 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.290 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.291 10:03:09 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 
-- # IFS=: 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:48.673 10:03:10 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:48.673 00:07:48.673 real 0m1.315s 00:07:48.673 user 0m0.005s 00:07:48.673 sys 0m0.000s 00:07:48.673 10:03:10 accel.accel_copy -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:48.673 10:03:10 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:07:48.673 ************************************ 00:07:48.673 END TEST accel_copy 00:07:48.673 ************************************ 00:07:48.673 10:03:10 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:48.673 10:03:10 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:07:48.673 10:03:10 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:48.673 10:03:10 accel -- common/autotest_common.sh@10 -- # set +x 00:07:48.673 ************************************ 00:07:48.673 START TEST accel_fill 00:07:48.673 ************************************ 00:07:48.673 10:03:10 accel.accel_fill -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 
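Before every accel_test invocation the trace repeats the same build_accel_config block (accel_json_cfg=(), the '[[ 0 -gt 0 ]]' guards, 'local IFS=,' and 'jq -r .'): with no crypto/DSA/IAA module requested it emits an effectively empty accel JSON config, which accel_perf then reads over the '-c /dev/fd/62' descriptor, i.e. the call at accel/accel.sh@12 appears to feed the config through process substitution. A rough reconstruction under those assumptions (the exact JSON layout and guard conditions in accel.sh differ from this sketch):
# sketch only: names beyond those visible in the trace are illustrative
build_accel_config() {
    local accel_json_cfg=()   # stays empty here: none of the [[ ... -gt 0 ]] guards fired
    local IFS=,
    jq -r . <<< "{\"subsystems\":[{\"subsystem\":\"accel\",\"config\":[${accel_json_cfg[*]}]}]}"
}
accel_perf() {
    # process substitution is why the config shows up as a /dev/fd/NN path in the trace
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c <(build_accel_config) "$@"
}
accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y   # the accel_fill arguments used above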
00:07:48.673 [2024-06-10 10:03:10.295028] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:48.673 [2024-06-10 10:03:10.295087] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid926941 ] 00:07:48.673 [2024-06-10 10:03:10.384452] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.673 [2024-06-10 10:03:10.457113] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.673 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:48.674 10:03:10 accel.accel_fill -- 
accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:48.674 10:03:10 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 
10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:50.057 10:03:11 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:50.057 00:07:50.057 real 0m1.338s 00:07:50.057 user 0m0.005s 00:07:50.057 sys 0m0.001s 00:07:50.057 10:03:11 accel.accel_fill -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:50.057 10:03:11 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:50.057 ************************************ 00:07:50.057 END TEST accel_fill 00:07:50.057 ************************************ 00:07:50.057 10:03:11 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:50.057 10:03:11 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:07:50.057 10:03:11 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:50.057 10:03:11 accel -- common/autotest_common.sh@10 -- # set +x 00:07:50.057 ************************************ 00:07:50.057 START TEST accel_copy_crc32c 00:07:50.057 ************************************ 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy_crc32c -y 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:50.057 [2024-06-10 10:03:11.698772] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
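Each successful run above finishes the same way: accel_test captures the module and opcode reported by accel_perf (accel/accel.sh@22/@23), and the accel/accel.sh@27 checks assert that a module handled the operation, that the expected opcode was exercised, and that it ran on the plain software path (the escaped '\s\o\f\t\w\a\r\e' is just how xtrace prints the quoted pattern); the real/user/sys summary appears to come from the time wrapper around each run_test. A minimal standalone sketch of those closing assertions, with the values hard-coded as they appear in the accel_fill trace above:
# values as captured in the trace; hard-coded here for illustration only
accel_module=software
accel_opc=fill
[[ -n "$accel_module" ]]                        # some module claimed the operation
[[ -n "$accel_opc" ]]                           # the expected opcode was exercised
[[ "$accel_module" == software ]] && echo "accel_fill ran on the software module"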
00:07:50.057 [2024-06-10 10:03:11.698852] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid927394 ] 00:07:50.057 [2024-06-10 10:03:11.785300] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.057 [2024-06-10 10:03:11.848604] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:50.057 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.058 10:03:11 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.438 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:51.438 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.438 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.438 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.438 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:51.438 
10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.438 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.438 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.438 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:51.438 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.438 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.438 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:51.439 00:07:51.439 real 0m1.319s 00:07:51.439 user 0m0.006s 00:07:51.439 sys 0m0.000s 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:51.439 10:03:12 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:51.439 ************************************ 00:07:51.439 END TEST accel_copy_crc32c 00:07:51.439 ************************************ 00:07:51.439 10:03:13 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:51.439 10:03:13 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:07:51.439 10:03:13 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:51.439 10:03:13 accel -- common/autotest_common.sh@10 -- # set +x 00:07:51.439 ************************************ 00:07:51.439 START TEST accel_copy_crc32c_C2 00:07:51.439 ************************************ 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
copy_crc32c -y -C 2 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:51.439 [2024-06-10 10:03:13.094394] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:07:51.439 [2024-06-10 10:03:13.094455] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid927716 ] 00:07:51.439 [2024-06-10 10:03:13.185694] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.439 [2024-06-10 10:03:13.250761] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- 
accel/accel.sh@20 -- # val=0 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.439 10:03:13 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:52.819 00:07:52.819 real 0m1.319s 00:07:52.819 user 0m0.007s 00:07:52.819 sys 0m0.001s 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:52.819 10:03:14 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:52.819 ************************************ 00:07:52.819 END TEST accel_copy_crc32c_C2 00:07:52.819 ************************************ 00:07:52.819 10:03:14 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 
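The two copy_crc32c cases that just completed above exercise the accel framework's copy-with-CRC-32C operation: the source buffer is copied to the destination while a CRC-32C of the data is computed, and the -y flag asks accel_perf to verify the result. A minimal sketch for repeating just these two measurements by hand, assuming the SPDK build tree checked out earlier in this job and dropping the -c /dev/fd/62 JSON accel config that accel.sh pipes in (without it accel_perf should fall back to the software module that the trace itself reports as accel_module=software):

  # 1-second copy_crc32c run with result verification (-y), software path
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w copy_crc32c -y
  # the -C 2 variant driven by the second test (accel_copy_crc32c_C2)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w copy_crc32c -y -C 2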
00:07:52.819 10:03:14 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:07:52.819 10:03:14 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:52.819 10:03:14 accel -- common/autotest_common.sh@10 -- # set +x 00:07:52.819 ************************************ 00:07:52.819 START TEST accel_dualcast 00:07:52.819 ************************************ 00:07:52.819 10:03:14 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dualcast -y 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:52.819 10:03:14 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:52.819 [2024-06-10 10:03:14.483193] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
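The dualcast case being started here drives the accel framework's dualcast operation, which writes a single source buffer to two destination buffers in one operation; -y again requests that the output be verified. A hand-run sketch, under the same caveat about the dropped -c /dev/fd/62 config as in the copy_crc32c sketch above:

  # 1-second dualcast run, verified (-y), software module
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w dualcast -y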
00:07:52.819 [2024-06-10 10:03:14.483257] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid928031 ] 00:07:52.819 [2024-06-10 10:03:14.572177] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.819 [2024-06-10 10:03:14.647799] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:53.079 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.080 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.080 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.080 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.080 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.080 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.080 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.080 10:03:14 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.080 10:03:14 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.080 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.080 10:03:14 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.017 10:03:15 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.018 10:03:15 
accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:54.018 10:03:15 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:54.018 00:07:54.018 real 0m1.328s 00:07:54.018 user 0m0.006s 00:07:54.018 sys 0m0.000s 00:07:54.018 10:03:15 accel.accel_dualcast -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:54.018 10:03:15 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:54.018 ************************************ 00:07:54.018 END TEST accel_dualcast 00:07:54.018 ************************************ 00:07:54.018 10:03:15 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:54.018 10:03:15 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:07:54.018 10:03:15 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:54.018 10:03:15 accel -- common/autotest_common.sh@10 -- # set +x 00:07:54.018 ************************************ 00:07:54.018 START TEST accel_compare 00:07:54.018 ************************************ 00:07:54.018 10:03:15 accel.accel_compare -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compare -y 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:54.018 10:03:15 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:54.018 [2024-06-10 10:03:15.880515] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
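The compare case launched at this point exercises the accel framework's buffer-compare operation, essentially a memcmp-style check of two equally sized buffers (4096 bytes here, per the values that follow in the trace). Hand-run sketch, same assumptions as the sketches above:

  # 1-second compare run, verified (-y)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w compare -y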
00:07:54.018 [2024-06-10 10:03:15.880574] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid928164 ] 00:07:54.277 [2024-06-10 10:03:15.969034] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.277 [2024-06-10 10:03:16.036709] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 
accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 10:03:16 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 
accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:55.658 10:03:17 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:55.658 00:07:55.658 real 0m1.325s 00:07:55.658 user 0m0.005s 00:07:55.658 sys 0m0.001s 00:07:55.658 10:03:17 accel.accel_compare -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:55.658 10:03:17 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:55.658 ************************************ 00:07:55.658 END TEST accel_compare 00:07:55.658 ************************************ 00:07:55.658 10:03:17 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:55.658 10:03:17 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:07:55.658 10:03:17 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:55.658 10:03:17 accel -- common/autotest_common.sh@10 -- # set +x 00:07:55.658 ************************************ 00:07:55.658 START TEST accel_xor 00:07:55.658 ************************************ 00:07:55.658 10:03:17 accel.accel_xor -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w xor -y 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:55.658 [2024-06-10 10:03:17.273005] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
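The xor case starting here drives the accel framework's XOR operation, which XORs several source buffers into one destination; this first run appears to use two sources, per the val=2 entry further down in the trace. Hand-run sketch, same assumptions as above:

  # 1-second xor run over the default source count, verified (-y)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w xor -y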
00:07:55.658 [2024-06-10 10:03:17.273057] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid928394 ] 00:07:55.658 [2024-06-10 10:03:17.362537] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.658 [2024-06-10 10:03:17.437189] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:55.658 10:03:17 accel.accel_xor -- 
accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.658 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.659 10:03:17 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.659 10:03:17 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.659 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.659 10:03:17 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:57.041 00:07:57.041 real 0m1.326s 00:07:57.041 user 0m0.006s 00:07:57.041 sys 0m0.000s 00:07:57.041 10:03:18 accel.accel_xor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:57.041 10:03:18 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:57.041 ************************************ 00:07:57.041 END TEST accel_xor 00:07:57.041 ************************************ 00:07:57.041 10:03:18 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:57.041 10:03:18 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:07:57.041 10:03:18 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:57.041 10:03:18 accel -- common/autotest_common.sh@10 -- # set +x 00:07:57.041 ************************************ 00:07:57.041 START TEST accel_xor 00:07:57.041 ************************************ 00:07:57.041 10:03:18 accel.accel_xor -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w xor -y -x 3 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:57.041 [2024-06-10 10:03:18.665187] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
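The second xor case repeats the same workload with -x 3, which, matching the val=3 entry in the trace below, raises the number of XOR source buffers to three:

  # xor with three source buffers (-x 3), verified (-y)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w xor -y -x 3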
00:07:57.041 [2024-06-10 10:03:18.665260] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid928706 ] 00:07:57.041 [2024-06-10 10:03:18.753045] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.041 [2024-06-10 10:03:18.824624] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:57.041 10:03:18 accel.accel_xor -- 
accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.041 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.042 10:03:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:58.429 10:03:19 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:58.429 00:07:58.429 real 0m1.324s 00:07:58.429 user 0m0.004s 00:07:58.429 sys 0m0.002s 00:07:58.429 10:03:19 accel.accel_xor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:58.429 10:03:19 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:58.429 ************************************ 00:07:58.429 END TEST accel_xor 00:07:58.429 ************************************ 00:07:58.429 10:03:19 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:58.429 10:03:19 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:07:58.429 10:03:19 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:58.429 10:03:19 accel -- common/autotest_common.sh@10 -- # set +x 00:07:58.429 ************************************ 00:07:58.429 START TEST accel_dif_verify 00:07:58.429 ************************************ 00:07:58.429 10:03:20 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_verify 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:58.429 10:03:20 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:07:58.429 [2024-06-10 10:03:20.059580] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
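The dif_verify case being launched here covers the accel framework's DIF (T10 Data Integrity Field) verification path; the 512-byte and 8-byte values that follow in the trace presumably correspond to the protected block size and the per-block metadata being validated. Hand-run sketch, same assumptions as the earlier ones (note the script passes no -y for this workload):

  # 1-second dif_verify run
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w dif_verify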
************************************
START TEST accel_dif_verify
************************************
10:03:20 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify
10:03:20 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify
10:03:20 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config
10:03:20 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r .
[2024-06-10 10:03:20.059580] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization...
[2024-06-10 10:03:20.059643] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid929029 ]
[2024-06-10 10:03:20.149007] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-06-10 10:03:20.218656] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify
10:03:20 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes'
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes'
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes'
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes'
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software
10:03:20 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds'
10:03:20 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No
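For reference, the dif_verify run traced above can be replayed by hand with the same binary and the flags that appear in the trace. The harness's extra -c /dev/fd/62 argument carries the JSON that build_accel_config emits over process substitution; the sketch assumes it can be left out when only the built-in software module is exercised, and SPDK_DIR is just a shorthand for the workspace path in the log.
# Sketch: replaying the dif_verify run outside the harness, flags taken from the log.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
# -t 1            run for 1 second (matches the val='1 seconds' entry above)
# -w dif_verify   select the DIF verify workload
"$SPDK_DIR/build/examples/accel_perf" -t 1 -w dif_verify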
10:03:21 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]]
10:03:21 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]]
10:03:21 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
real 0m1.324s
user 0m0.005s
sys 0m0.001s
************************************
END TEST accel_dif_verify
************************************
10:03:21 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate
************************************
START TEST accel_dif_generate
************************************
10:03:21 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate
10:03:21 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate
10:03:21 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config
10:03:21 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r .
[2024-06-10 10:03:21.454809] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization...
[2024-06-10 10:03:21.454889] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid929242 ]
[2024-06-10 10:03:21.544285] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-06-10 10:03:21.615609] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate
10:03:21 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes'
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes'
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes'
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes'
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software
10:03:21 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds'
10:03:21 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No
10:03:22 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]]
10:03:22 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]]
10:03:22 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
real 0m1.325s
user 0m0.007s
sys 0m0.001s
************************************
END TEST accel_dif_generate
************************************
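The build_accel_config, accel_json_cfg=(), 'local IFS=,' and 'jq -r .' lines that precede every accel_perf start suggest the harness joins optional JSON fragments into a single config document and hands it to accel_perf on an anonymous file descriptor (the -c /dev/fd/62 argument). The sketch below only illustrates that idea; build_accel_config_sketch, the fragment layout and the subsystem names are assumptions, not the actual accel.sh code.
# Hypothetical sketch of the config-assembly pattern implied by the trace.
build_accel_config_sketch() {
    local accel_json_cfg=("$@")   # optional JSON fragments (crypto, compressdev, ...)
    local IFS=,
    # Join the fragments with commas and normalize through jq,
    # mirroring the 'local IFS=,' and 'jq -r .' entries in the trace.
    printf '{"subsystems":[{"subsystem":"accel","config":[%s]}]}' "${accel_json_cfg[*]}" | jq -r .
}
# Used roughly the way the trace shows accel_perf being started:
#   accel_perf -c <(build_accel_config_sketch) -t 1 -w dif_generate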
10:03:22 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy
************************************
START TEST accel_dif_generate_copy
************************************
10:03:22 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy
10:03:22 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy
10:03:22 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config
10:03:22 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r .
[2024-06-10 10:03:22.846832] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization...
[2024-06-10 10:03:22.846895] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid929399 ]
[2024-06-10 10:03:22.935536] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-06-10 10:03:23.006169] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes'
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes'
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds'
10:03:23 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No
10:03:24 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]]
10:03:24 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]]
10:03:24 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
real 0m1.323s
user 0m0.005s
sys 0m0.000s
************************************
END TEST accel_dif_generate_copy
************************************
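accel.sh@111-113 drive the three DIF workloads with identical parameters, changing only the -w value. A compact way to reproduce that sweep by hand is sketched below, assuming the binary path from the log and that the -c config argument can be omitted when only the software module is used.
# Sketch: run the three DIF workloads back to back, as accel.sh@111-113 does.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
for wl in dif_verify dif_generate dif_generate_copy; do
    echo "=== $wl ==="
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w "$wl"
done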
10:03:24 accel -- accel/accel.sh@115 -- # [[ y == y ]]
10:03:24 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
************************************
START TEST accel_comp
************************************
10:03:24 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
10:03:24 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
10:03:24 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config
10:03:24 accel.accel_comp -- accel/accel.sh@41 -- # jq -r .
[2024-06-10 10:03:24.236723] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization...
[2024-06-10 10:03:24.236802] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid929705 ]
[2024-06-10 10:03:24.326053] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-06-10 10:03:24.400350] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
10:03:24 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1
10:03:24 accel.accel_comp -- accel/accel.sh@20 -- # val=compress
10:03:24 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress
10:03:24 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes'
10:03:24 accel.accel_comp -- accel/accel.sh@20 -- # val=software
10:03:24 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software
10:03:24 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
10:03:24 accel.accel_comp -- accel/accel.sh@20 -- # val=32
10:03:24 accel.accel_comp -- accel/accel.sh@20 -- # val=32
10:03:24 accel.accel_comp -- accel/accel.sh@20 -- # val=1
10:03:24 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds'
10:03:24 accel.accel_comp -- accel/accel.sh@20 -- # val=No
10:03:25 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]]
10:03:25 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]]
10:03:25 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
real 0m1.338s
user 0m0.006s
sys 0m0.000s
************************************
END TEST accel_comp
************************************
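accel_comp and accel_decomp reuse the same input corpus (test/accel/bib) and differ only in the workload and in the -y flag on the decompress side, which lines up with the Yes value the parser reads for accel_decomp. A hand-run equivalent is sketched below, flags as logged, with the -c config argument omitted (an assumption for the software module).
# Sketch: the compress/decompress pair from accel.sh@116-117.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
BIB=$SPDK_DIR/test/accel/bib   # input file used by the compression tests
"$SPDK_DIR/build/examples/accel_perf" -t 1 -w compress   -l "$BIB"
"$SPDK_DIR/build/examples/accel_perf" -t 1 -w decompress -l "$BIB" -y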
10:03:25 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
************************************
START TEST accel_decomp
************************************
10:03:25 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
10:03:25 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
10:03:25 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config
10:03:25 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r .
[2024-06-10 10:03:25.645726] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization...
[2024-06-10 10:03:25.645784] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid930024 ]
[2024-06-10 10:03:25.735012] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-06-10 10:03:25.811075] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
10:03:25 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1
10:03:25 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress
10:03:25 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress
10:03:25 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes'
10:03:25 accel.accel_decomp -- accel/accel.sh@20 -- # val=software
10:03:25 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software
10:03:25 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
10:03:25 accel.accel_decomp -- accel/accel.sh@20 -- # val=32
10:03:25 accel.accel_decomp -- accel/accel.sh@20 -- # val=32
10:03:25 accel.accel_decomp -- accel/accel.sh@20 -- # val=1
10:03:25 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds'
10:03:25 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes
10:03:26 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]]
10:03:26 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]]
10:03:26 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
real 0m1.334s
user 0m0.006s
sys 0m0.000s
************************************
END TEST accel_decomp
************************************
10:03:26 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
************************************
START TEST accel_decomp_full
************************************
10:03:27 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
10:03:27 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
10:03:27 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config
10:03:27 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r .
[2024-06-10 10:03:27.047926] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization...
[2024-06-10 10:03:27.047988] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid930348 ]
[2024-06-10 10:03:27.145602] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-06-10 10:03:27.215196] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1
10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress
10:03:27 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress
10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes'
-- # read -r var val 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.437 10:03:27 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.818 10:03:28 accel.accel_decomp_full -- 
accel/accel.sh@20 -- # val= 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.818 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:06.819 10:03:28 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:06.819 00:08:06.819 real 0m1.339s 00:08:06.819 user 0m0.005s 00:08:06.819 sys 0m0.001s 00:08:06.819 10:03:28 accel.accel_decomp_full -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:06.819 10:03:28 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:06.819 ************************************ 00:08:06.819 END TEST accel_decomp_full 00:08:06.819 ************************************ 00:08:06.819 10:03:28 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:06.819 10:03:28 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:08:06.819 10:03:28 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:06.819 10:03:28 accel -- common/autotest_common.sh@10 -- # set +x 00:08:06.819 ************************************ 00:08:06.819 START TEST accel_decomp_mcore 00:08:06.819 ************************************ 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:06.819 [2024-06-10 10:03:28.455184] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:08:06.819 [2024-06-10 10:03:28.455250] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid930459 ] 00:08:06.819 [2024-06-10 10:03:28.543599] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:06.819 [2024-06-10 10:03:28.616569] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:08:06.819 [2024-06-10 10:03:28.616681] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:08:06.819 [2024-06-10 10:03:28.616847] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:08:06.819 [2024-06-10 10:03:28.616850] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 
10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.819 10:03:28 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- 
accel/accel.sh@20 -- # val= 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:08.292 00:08:08.292 real 0m1.339s 00:08:08.292 user 0m4.476s 00:08:08.292 sys 0m0.130s 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:08.292 10:03:29 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:08.292 ************************************ 00:08:08.292 END TEST accel_decomp_mcore 00:08:08.292 ************************************ 00:08:08.292 10:03:29 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.292 10:03:29 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:08:08.293 10:03:29 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:08.293 10:03:29 accel -- common/autotest_common.sh@10 -- # set +x 00:08:08.293 ************************************ 00:08:08.293 START TEST accel_decomp_full_mcore 00:08:08.293 ************************************ 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:08.293 10:03:29 
accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:08.293 10:03:29 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:08.293 [2024-06-10 10:03:29.872762] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:08:08.293 [2024-06-10 10:03:29.872832] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid930703 ] 00:08:08.293 [2024-06-10 10:03:29.961318] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:08.293 [2024-06-10 10:03:30.030276] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:08:08.293 [2024-06-10 10:03:30.030387] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:08:08.293 [2024-06-10 10:03:30.030537] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.293 [2024-06-10 10:03:30.030538] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 
accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.293 10:03:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 
accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:09.676 00:08:09.676 real 0m1.351s 00:08:09.676 user 0m4.522s 00:08:09.676 sys 0m0.132s 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:09.676 10:03:31 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:09.676 ************************************ 00:08:09.676 END TEST accel_decomp_full_mcore 00:08:09.676 ************************************ 00:08:09.676 10:03:31 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:09.676 10:03:31 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:08:09.676 10:03:31 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:09.676 10:03:31 accel -- common/autotest_common.sh@10 -- # set +x 00:08:09.676 ************************************ 00:08:09.676 START TEST accel_decomp_mthread 00:08:09.676 ************************************ 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:09.676 [2024-06-10 10:03:31.299043] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:08:09.676 [2024-06-10 10:03:31.299105] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid931035 ] 00:08:09.676 [2024-06-10 10:03:31.399835] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.676 [2024-06-10 10:03:31.475004] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.676 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.677 10:03:31 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.058 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.059 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:11.059 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:11.059 10:03:32 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:11.059 00:08:11.059 real 0m1.352s 00:08:11.059 user 0m1.213s 00:08:11.059 sys 0m0.141s 00:08:11.059 10:03:32 accel.accel_decomp_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:11.059 10:03:32 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:11.059 ************************************ 00:08:11.059 END TEST accel_decomp_mthread 00:08:11.059 ************************************ 00:08:11.059 10:03:32 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.059 10:03:32 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:08:11.059 10:03:32 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:11.059 10:03:32 accel -- common/autotest_common.sh@10 -- # set +x 00:08:11.059 ************************************ 00:08:11.059 START TEST accel_decomp_full_mthread 00:08:11.059 
************************************ 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:11.059 [2024-06-10 10:03:32.724173] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:08:11.059 [2024-06-10 10:03:32.724230] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid931350 ] 00:08:11.059 [2024-06-10 10:03:32.812341] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.059 [2024-06-10 10:03:32.880486] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.059 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.318 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.318 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.318 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.319 10:03:32 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.319 10:03:32 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:12.258 00:08:12.258 real 0m1.357s 00:08:12.258 user 0m1.239s 00:08:12.258 sys 0m0.125s 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:12.258 10:03:34 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:12.258 ************************************ 00:08:12.258 END TEST accel_decomp_full_mthread 00:08:12.258 ************************************ 
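Annotation: the run that ends here is the software-path accel_decomp_full_mthread case. The repeated "case \"$var\" in" / "IFS=:" / "read -r var val" trace lines are accel.sh consuming its list of expected settings (opcode decompress, module software, the 111250-byte payload, the 32/32 values, 2 threads, 1 second) before it launches accel_perf and, at accel.sh@27, compares what the tool reported against the expectation. A rough sketch of that parsing shape, as an illustration only (the real accel.sh differs in detail, and the herestring below is made up for the example):

while IFS=: read -r var val; do
    case "$var" in
        opc) accel_opc=$val ;;        # traced here as accel_opc=decompress
        module) accel_module=$val ;;  # traced here as accel_module=software
        *) : ;;                       # payload size, queue depth, thread count, run time, ...
    esac
done <<< $'opc:decompress\nmodule:software'
# Final assertions, exactly as traced at accel.sh@27:
[[ -n $accel_module ]] && [[ -n $accel_opc ]] && [[ $accel_module == software ]]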
00:08:12.258 10:03:34 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:08:12.258 10:03:34 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:08:12.258 10:03:34 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:08:12.258 10:03:34 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:12.258 10:03:34 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=931595 00:08:12.258 10:03:34 accel -- accel/accel.sh@63 -- # waitforlisten 931595 00:08:12.258 10:03:34 accel -- common/autotest_common.sh@830 -- # '[' -z 931595 ']' 00:08:12.258 10:03:34 accel -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:12.258 10:03:34 accel -- common/autotest_common.sh@835 -- # local max_retries=100 00:08:12.258 10:03:34 accel -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:12.258 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:12.258 10:03:34 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:12.258 10:03:34 accel -- common/autotest_common.sh@839 -- # xtrace_disable 00:08:12.258 10:03:34 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:12.258 10:03:34 accel -- common/autotest_common.sh@10 -- # set +x 00:08:12.258 10:03:34 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:12.258 10:03:34 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:12.258 10:03:34 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:12.258 10:03:34 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:12.258 10:03:34 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:12.258 10:03:34 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:12.258 10:03:34 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:12.258 10:03:34 accel -- accel/accel.sh@41 -- # jq -r . 00:08:12.518 [2024-06-10 10:03:34.146154] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
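Annotation: at accel.sh@124-126 above the harness sets COMPRESSDEV=1 and restarts spdk_tgt with an accel configuration passed in on /dev/fd/63; the only fragment added by build_accel_config is the compressdev_scan_accel_module call visible in the trace. A reconstruction of roughly what that JSON looks like (the compressdev fragment is verbatim from the trace; the outer "subsystems" wrapper is inferred from the jq filter used in the save_config check and should be treated as an assumption). pmd 0 appears to mean "use any available PMD"; in this run the QAT PMD is what gets picked up ("initialized QAT PMD", "PMD being used: compress_qat"):

cat <<'JSON'
{
  "subsystems": [
    {
      "subsystem": "accel",
      "config": [
        {"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}
      ]
    }
  ]
}
JSON
# Once spdk_tgt is listening, accel.sh verifies the module took effect (both commands are traced below):
rpc_cmd save_config | jq -r '.subsystems[] | select(.subsystem=="accel").config[]' | grep compressdev_scan_accel_module
rpc_cmd accel_get_opc_assignments | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
# The assignment dump drives the expected_opcs loop: compress/decompress come back as
# dpdk_compressdev, every other opcode stays on the software module.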
00:08:12.518 [2024-06-10 10:03:34.146208] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid931595 ] 00:08:12.518 [2024-06-10 10:03:34.234505] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.518 [2024-06-10 10:03:34.297589] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.088 [2024-06-10 10:03:34.696255] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:13.349 10:03:34 accel -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:08:13.349 10:03:34 accel -- common/autotest_common.sh@863 -- # return 0 00:08:13.349 10:03:34 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:13.349 10:03:34 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:13.349 10:03:34 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:13.349 10:03:34 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:08:13.349 10:03:34 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:08:13.349 10:03:34 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:08:13.349 10:03:34 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:08:13.349 10:03:34 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:13.349 10:03:34 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.349 10:03:34 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:13.349 10:03:35 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:13.349 "method": "compressdev_scan_accel_module", 00:08:13.349 10:03:35 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:13.349 10:03:35 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:13.349 10:03:35 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:13.349 10:03:35 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:13.349 10:03:35 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.349 10:03:35 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 
00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.349 10:03:35 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.349 10:03:35 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.349 10:03:35 accel -- accel/accel.sh@75 -- # killprocess 931595 00:08:13.349 10:03:35 accel -- common/autotest_common.sh@949 -- # '[' -z 931595 ']' 00:08:13.349 10:03:35 accel -- common/autotest_common.sh@953 -- # kill -0 931595 00:08:13.349 10:03:35 accel -- common/autotest_common.sh@954 -- # uname 00:08:13.349 10:03:35 accel -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:08:13.349 10:03:35 accel -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 931595 00:08:13.610 10:03:35 accel -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:08:13.610 10:03:35 accel -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:08:13.610 10:03:35 accel -- common/autotest_common.sh@967 -- # echo 'killing process with pid 931595' 00:08:13.610 killing process with pid 931595 00:08:13.610 10:03:35 accel -- common/autotest_common.sh@968 -- # kill 931595 00:08:13.610 10:03:35 accel -- common/autotest_common.sh@973 -- # wait 931595 00:08:13.610 10:03:35 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:13.610 10:03:35 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:13.610 10:03:35 accel -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:08:13.610 10:03:35 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:13.610 10:03:35 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.896 ************************************ 00:08:13.896 START TEST accel_cdev_comp 00:08:13.896 ************************************ 00:08:13.896 10:03:35 accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:13.896 10:03:35 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:13.896 [2024-06-10 10:03:35.508659] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:08:13.896 [2024-06-10 10:03:35.508725] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid931720 ] 00:08:13.896 [2024-06-10 10:03:35.596098] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.896 [2024-06-10 10:03:35.665600] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.466 [2024-06-10 10:03:36.061940] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:14.466 [2024-06-10 10:03:36.063714] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12cd330 PMD being used: compress_qat 00:08:14.466 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.466 [2024-06-10 10:03:36.066738] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x14d2070 PMD being used: compress_qat 00:08:14.466 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.466 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.466 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.466 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.466 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.466 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.466 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.467 10:03:36 accel.accel_cdev_comp 
-- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case 
"$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.467 10:03:36 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.405 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:15.405 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.405 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.405 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.405 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:15.405 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.405 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:15.406 10:03:37 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:15.406 00:08:15.406 real 0m1.693s 00:08:15.406 user 0m1.405s 00:08:15.406 sys 0m0.290s 00:08:15.406 10:03:37 accel.accel_cdev_comp -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:08:15.406 10:03:37 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:08:15.406 ************************************ 00:08:15.406 END TEST accel_cdev_comp 00:08:15.406 ************************************ 00:08:15.406 10:03:37 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:15.406 10:03:37 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:08:15.406 10:03:37 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:15.406 10:03:37 accel -- common/autotest_common.sh@10 -- # set +x 00:08:15.406 ************************************ 00:08:15.406 START TEST accel_cdev_decomp 00:08:15.406 ************************************ 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:15.406 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:15.666 [2024-06-10 10:03:37.276809] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
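Annotation: each accel_cdev_* test (accel_cdev_comp above, the decomp variants that follow) launches accel_perf with "-c /dev/fd/62". The build_accel_config trace shows the compressdev_scan_accel_module fragment being appended to accel_json_cfg, joined with IFS=, and passed through jq -r . A sketch of how those traced pieces plausibly fit together; the process substitution and the outer "subsystems" wrapper are assumptions, while the array append, the comma join, the jq step and the accel_perf flags are taken from the trace:

# Sketch only - how the traced accel_json_cfg fragment plausibly becomes /dev/fd/62:
accel_json_cfg=()
accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
accel_cfg="{\"subsystems\":[{\"subsystem\":\"accel\",\"config\":[$(IFS=,; echo "${accel_json_cfg[*]}")]}]}"
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
        -c <(jq -r . <<< "$accel_cfg") \
        -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
# With this config the accel framework routes compress/decompress to the dpdk_compressdev module,
# which is why the cdev tests expect accel_module=dpdk_compressdev instead of software.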
00:08:15.666 [2024-06-10 10:03:37.276880] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid932046 ] 00:08:15.666 [2024-06-10 10:03:37.366571] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.666 [2024-06-10 10:03:37.442654] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.235 [2024-06-10 10:03:37.845276] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:16.235 [2024-06-10 10:03:37.847009] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2133330 PMD being used: compress_qat 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.235 [2024-06-10 10:03:37.850149] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2338070 PMD being used: compress_qat 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 
-- # IFS=: 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:16.235 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.236 10:03:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.236 10:03:37 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:17.176 00:08:17.176 real 0m1.708s 00:08:17.176 user 0m1.400s 00:08:17.176 sys 0m0.305s 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:17.176 10:03:38 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:17.176 ************************************ 00:08:17.176 END TEST accel_cdev_decomp 00:08:17.176 ************************************ 00:08:17.176 10:03:38 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:17.176 10:03:38 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:08:17.176 10:03:38 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:17.176 10:03:38 accel -- common/autotest_common.sh@10 -- # set +x 00:08:17.176 ************************************ 00:08:17.176 START TEST accel_cdev_decomp_full 00:08:17.176 ************************************ 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 
-o 0 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:17.176 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:17.435 [2024-06-10 10:03:39.057721] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
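Annotation: accel_cdev_decomp_full repeats the previous decompress test with one extra option, -o 0, and the traced expectation changes from '4096 bytes' to '111250 bytes'; -o 0 appears to make accel_perf operate on the full decompressed size of the bib input instead of a fixed 4 KiB block. Both underlying commands are verbatim in the log, differing only in -o:

# accel_cdev_decomp (traced expectation: '4096 bytes'):
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
# accel_cdev_decomp_full (traced expectation: '111250 bytes'):
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0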
00:08:17.435 [2024-06-10 10:03:39.057775] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid932365 ] 00:08:17.435 [2024-06-10 10:03:39.149224] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.435 [2024-06-10 10:03:39.224103] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.005 [2024-06-10 10:03:39.622777] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:18.005 [2024-06-10 10:03:39.624564] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xcc0330 PMD being used: compress_qat 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.005 [2024-06-10 10:03:39.626788] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xcc35f0 PMD being used: compress_qat 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 
00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.005 10:03:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:18.945 00:08:18.945 real 0m1.705s 00:08:18.945 user 0m1.421s 00:08:18.945 sys 0m0.287s 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:18.945 10:03:40 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:18.945 ************************************ 00:08:18.945 END TEST accel_cdev_decomp_full 00:08:18.945 ************************************ 00:08:18.945 10:03:40 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:18.945 10:03:40 accel -- common/autotest_common.sh@1100 -- 
# '[' 11 -le 1 ']' 00:08:18.945 10:03:40 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:18.945 10:03:40 accel -- common/autotest_common.sh@10 -- # set +x 00:08:18.945 ************************************ 00:08:18.945 START TEST accel_cdev_decomp_mcore 00:08:18.945 ************************************ 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:18.945 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:19.205 10:03:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:19.205 [2024-06-10 10:03:40.837368] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
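Annotation: the mcore variant adds -m 0xf to the same decompress run (traced above as accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf). 0xf is binary 1111, a core mask covering cores 0-3, which is why the app output below reports "Total cores available: 4" and starts four reactors. A tiny illustrative helper for reading such a mask (not SPDK code):

mask=0xf
for ((core = 0; core < 8; core++)); do
    (( (mask >> core) & 1 )) && echo "reactor would run on core $core"
done
# Prints cores 0 through 3 for 0xf, matching the four reactor_run notices in the log.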
00:08:19.205 [2024-06-10 10:03:40.837429] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid932692 ] 00:08:19.205 [2024-06-10 10:03:40.928575] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:19.205 [2024-06-10 10:03:41.005815] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:08:19.205 [2024-06-10 10:03:41.005927] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:08:19.205 [2024-06-10 10:03:41.005993] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.205 [2024-06-10 10:03:41.005993] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:08:19.775 [2024-06-10 10:03:41.400575] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:19.776 [2024-06-10 10:03:41.402327] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10f6950 PMD being used: compress_qat 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 [2024-06-10 10:03:41.406491] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2fa019b8b0 PMD being used: compress_qat 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 [2024-06-10 10:03:41.407383] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2f9819b8b0 PMD being used: compress_qat 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 [2024-06-10 10:03:41.408417] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10fbe40 PMD being used: compress_qat 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:19.776 [2024-06-10 10:03:41.408481] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2f9019b8b0 PMD being used: compress_qat 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore 
-- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.776 10:03:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:20.715 00:08:20.715 real 0m1.715s 00:08:20.715 user 0m5.776s 00:08:20.715 sys 0m0.297s 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:20.715 10:03:42 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:20.715 ************************************ 00:08:20.715 END TEST accel_cdev_decomp_mcore 00:08:20.715 ************************************ 00:08:20.715 10:03:42 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:20.715 10:03:42 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:08:20.715 10:03:42 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:20.715 10:03:42 accel -- common/autotest_common.sh@10 -- # set +x 00:08:20.975 ************************************ 00:08:20.975 START TEST accel_cdev_decomp_full_mcore 00:08:20.975 ************************************ 00:08:20.975 10:03:42 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:20.975 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:20.975 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:20.975 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.976 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.976 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:20.976 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:20.976 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:20.976 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:20.976 10:03:42 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:20.976 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:20.976 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:20.976 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:20.976 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:20.976 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:20.976 10:03:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:20.976 [2024-06-10 10:03:42.627346] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:08:20.976 [2024-06-10 10:03:42.627407] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid933015 ] 00:08:20.976 [2024-06-10 10:03:42.717357] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:20.976 [2024-06-10 10:03:42.795853] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:08:20.976 [2024-06-10 10:03:42.795970] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:08:20.976 [2024-06-10 10:03:42.796127] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.976 [2024-06-10 10:03:42.796127] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:08:21.547 [2024-06-10 10:03:43.196904] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:21.547 [2024-06-10 10:03:43.198662] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xbee950 PMD being used: compress_qat 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:21.547 [2024-06-10 10:03:43.201876] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f056019b8b0 PMD being used: compress_qat 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.547 [2024-06-10 10:03:43.202737] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f055819b8b0 PMD being used: compress_qat 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.547 10:03:43 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.547 [2024-06-10 10:03:43.203589] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xbee9f0 PMD being used: compress_qat 00:08:21.547 [2024-06-10 10:03:43.203672] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f055019b8b0 PMD being used: compress_qat 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.547 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 
-- # case "$var" in 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.548 10:03:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.487 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:22.488 00:08:22.488 real 0m1.721s 00:08:22.488 user 0m5.801s 00:08:22.488 sys 0m0.291s 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:22.488 10:03:44 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:22.488 ************************************ 00:08:22.488 END TEST accel_cdev_decomp_full_mcore 00:08:22.488 ************************************ 00:08:22.488 10:03:44 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:22.488 10:03:44 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:08:22.488 10:03:44 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:22.488 10:03:44 accel -- common/autotest_common.sh@10 -- # set +x 00:08:22.747 
************************************ 00:08:22.747 START TEST accel_cdev_decomp_mthread 00:08:22.747 ************************************ 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:22.747 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:22.747 [2024-06-10 10:03:44.418752] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:08:22.747 [2024-06-10 10:03:44.418817] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid933339 ] 00:08:22.747 [2024-06-10 10:03:44.518089] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.747 [2024-06-10 10:03:44.593696] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.318 [2024-06-10 10:03:44.987655] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:23.318 [2024-06-10 10:03:44.989418] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2421330 PMD being used: compress_qat 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 [2024-06-10 10:03:44.992721] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2626070 PMD being used: compress_qat 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 [2024-06-10 10:03:44.994458] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2548ec0 PMD being used: compress_qat 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread 
-- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:45 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.318 10:03:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:24.257 
10:03:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:24.257 00:08:24.257 real 0m1.709s 00:08:24.257 user 0m1.415s 00:08:24.257 sys 0m0.297s 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:24.257 10:03:46 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:24.257 ************************************ 00:08:24.257 END TEST accel_cdev_decomp_mthread 00:08:24.257 ************************************ 00:08:24.517 10:03:46 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:24.517 10:03:46 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:08:24.517 10:03:46 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:24.517 10:03:46 accel -- common/autotest_common.sh@10 -- # set +x 00:08:24.517 ************************************ 00:08:24.517 START TEST accel_cdev_decomp_full_mthread 00:08:24.517 ************************************ 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:24.517 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:24.517 [2024-06-10 10:03:46.208496] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:08:24.517 [2024-06-10 10:03:46.208596] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid933659 ] 00:08:24.517 [2024-06-10 10:03:46.303143] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.517 [2024-06-10 10:03:46.378055] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.087 [2024-06-10 10:03:46.775784] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:25.087 [2024-06-10 10:03:46.777535] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xb20330 PMD being used: compress_qat 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 [2024-06-10 10:03:46.780017] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xb235f0 PMD being used: compress_qat 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 [2024-06-10 10:03:46.782005] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xd0aa20 PMD being used: compress_qat 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.087 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 
00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.088 10:03:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.026 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.027 10:03:47 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:26.027 00:08:26.027 real 0m1.716s 00:08:26.027 user 0m1.419s 00:08:26.027 sys 0m0.299s 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:26.027 10:03:47 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:26.027 ************************************ 00:08:26.027 END TEST accel_cdev_decomp_full_mthread 00:08:26.027 ************************************ 00:08:26.287 10:03:47 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:26.287 10:03:47 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:26.287 10:03:47 accel -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:08:26.287 10:03:47 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:26.287 10:03:47 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:26.287 10:03:47 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:26.287 10:03:47 accel -- common/autotest_common.sh@10 -- # set +x 00:08:26.287 10:03:47 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:26.287 10:03:47 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:26.287 10:03:47 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:26.287 10:03:47 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:26.287 10:03:47 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:26.287 10:03:47 accel -- accel/accel.sh@41 -- # jq -r . 00:08:26.287 ************************************ 00:08:26.287 START TEST accel_dif_functional_tests 00:08:26.287 ************************************ 00:08:26.287 10:03:47 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:26.287 [2024-06-10 10:03:48.018928] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:08:26.287 [2024-06-10 10:03:48.018976] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid933984 ] 00:08:26.287 [2024-06-10 10:03:48.106227] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:26.547 [2024-06-10 10:03:48.181981] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:08:26.547 [2024-06-10 10:03:48.182152] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:08:26.547 [2024-06-10 10:03:48.182156] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.547 00:08:26.547 00:08:26.547 CUnit - A unit testing framework for C - Version 2.1-3 00:08:26.547 http://cunit.sourceforge.net/ 00:08:26.547 00:08:26.547 00:08:26.547 Suite: accel_dif 00:08:26.547 Test: verify: DIF generated, GUARD check ...passed 00:08:26.547 Test: verify: DIF generated, APPTAG check ...passed 00:08:26.547 Test: verify: DIF generated, REFTAG check ...passed 00:08:26.547 Test: verify: DIF not generated, GUARD check ...[2024-06-10 10:03:48.248090] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:26.547 passed 00:08:26.547 Test: verify: DIF not generated, APPTAG check ...[2024-06-10 10:03:48.248132] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:26.547 passed 00:08:26.547 Test: verify: DIF not generated, REFTAG check ...[2024-06-10 10:03:48.248151] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:26.547 passed 00:08:26.547 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:26.547 Test: verify: APPTAG incorrect, APPTAG check ...[2024-06-10 10:03:48.248196] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:26.547 passed 00:08:26.547 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:26.547 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:26.547 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:26.547 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-06-10 10:03:48.248304] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:26.547 passed 00:08:26.547 Test: verify copy: DIF generated, GUARD check ...passed 00:08:26.547 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:26.547 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:26.547 Test: verify copy: DIF not generated, GUARD check ...[2024-06-10 10:03:48.248425] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:26.547 passed 00:08:26.547 Test: verify copy: DIF not generated, APPTAG check ...[2024-06-10 10:03:48.248447] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:26.547 passed 00:08:26.547 Test: verify copy: DIF not generated, REFTAG check ...[2024-06-10 10:03:48.248468] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:26.547 passed 00:08:26.547 Test: generate copy: DIF generated, GUARD check ...passed 00:08:26.547 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:26.547 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:26.547 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:26.547 Test: 
generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:26.547 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:26.547 Test: generate copy: iovecs-len validate ...[2024-06-10 10:03:48.248646] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:08:26.547 passed 00:08:26.547 Test: generate copy: buffer alignment validate ...passed 00:08:26.547 00:08:26.547 Run Summary: Type Total Ran Passed Failed Inactive 00:08:26.547 suites 1 1 n/a 0 0 00:08:26.547 tests 26 26 26 0 0 00:08:26.548 asserts 115 115 115 0 n/a 00:08:26.548 00:08:26.548 Elapsed time = 0.002 seconds 00:08:26.548 00:08:26.548 real 0m0.397s 00:08:26.548 user 0m0.516s 00:08:26.548 sys 0m0.150s 00:08:26.548 10:03:48 accel.accel_dif_functional_tests -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:26.548 10:03:48 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:26.548 ************************************ 00:08:26.548 END TEST accel_dif_functional_tests 00:08:26.548 ************************************ 00:08:26.548 00:08:26.548 real 0m44.766s 00:08:26.548 user 0m53.992s 00:08:26.548 sys 0m7.511s 00:08:26.548 10:03:48 accel -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:26.548 10:03:48 accel -- common/autotest_common.sh@10 -- # set +x 00:08:26.548 ************************************ 00:08:26.548 END TEST accel 00:08:26.548 ************************************ 00:08:26.808 10:03:48 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:26.808 10:03:48 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:08:26.808 10:03:48 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:26.808 10:03:48 -- common/autotest_common.sh@10 -- # set +x 00:08:26.808 ************************************ 00:08:26.808 START TEST accel_rpc 00:08:26.808 ************************************ 00:08:26.808 10:03:48 accel_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:26.808 * Looking for test storage... 00:08:26.808 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:26.808 10:03:48 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:26.808 10:03:48 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=934226 00:08:26.808 10:03:48 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 934226 00:08:26.808 10:03:48 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:26.808 10:03:48 accel_rpc -- common/autotest_common.sh@830 -- # '[' -z 934226 ']' 00:08:26.808 10:03:48 accel_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:26.808 10:03:48 accel_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:08:26.808 10:03:48 accel_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:26.808 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:26.808 10:03:48 accel_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:08:26.808 10:03:48 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:26.808 [2024-06-10 10:03:48.643576] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
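Note: the accel_dif_functional_tests suite above exercises T10 DIF protection information, i.e. the 8 bytes appended to each 512-byte block: a 2-byte guard (a CRC over the block data), a 2-byte application tag and a 4-byte reference tag. The "DIF not generated" cases are deliberate negatives, so the *ERROR* "Failed to compare ..." lines are the expected output of passing tests (the Expected/Actual values reflect the 0x5a fill pattern left where real protection information would have been generated); all 26 tests pass in the run summary. The binary takes its accel configuration through a bash process substitution, which is what the "-c /dev/fd/62" in the trace refers to. A minimal sketch of re-running just this suite from the SPDK checkout (the JSON wrapper below is an assumption, since build_accel_config produced an empty module list in this run and its output is not echoed):

    # empty accel config: every opcode stays on the software accel module
    cfg='{"subsystems":[{"subsystem":"accel","config":[]}]}'
    ./test/accel/dif/dif -c <(echo "$cfg")
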
00:08:26.808 [2024-06-10 10:03:48.643643] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid934226 ] 00:08:27.068 [2024-06-10 10:03:48.737431] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.068 [2024-06-10 10:03:48.806492] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.636 10:03:49 accel_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:08:27.636 10:03:49 accel_rpc -- common/autotest_common.sh@863 -- # return 0 00:08:27.636 10:03:49 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:27.636 10:03:49 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:27.636 10:03:49 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:27.636 10:03:49 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:27.636 10:03:49 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:27.636 10:03:49 accel_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:08:27.636 10:03:49 accel_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:27.636 10:03:49 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:27.636 ************************************ 00:08:27.636 START TEST accel_assign_opcode 00:08:27.636 ************************************ 00:08:27.636 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # accel_assign_opcode_test_suite 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:27.896 [2024-06-10 10:03:49.508553] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:27.896 [2024-06-10 10:03:49.520577] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # 
set +x 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:27.896 software 00:08:27.896 00:08:27.896 real 0m0.220s 00:08:27.896 user 0m0.050s 00:08:27.896 sys 0m0.011s 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:27.896 10:03:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:27.896 ************************************ 00:08:27.896 END TEST accel_assign_opcode 00:08:27.896 ************************************ 00:08:27.896 10:03:49 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 934226 00:08:27.896 10:03:49 accel_rpc -- common/autotest_common.sh@949 -- # '[' -z 934226 ']' 00:08:27.896 10:03:49 accel_rpc -- common/autotest_common.sh@953 -- # kill -0 934226 00:08:27.896 10:03:49 accel_rpc -- common/autotest_common.sh@954 -- # uname 00:08:28.155 10:03:49 accel_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:08:28.155 10:03:49 accel_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 934226 00:08:28.155 10:03:49 accel_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:08:28.155 10:03:49 accel_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:08:28.155 10:03:49 accel_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 934226' 00:08:28.155 killing process with pid 934226 00:08:28.155 10:03:49 accel_rpc -- common/autotest_common.sh@968 -- # kill 934226 00:08:28.155 10:03:49 accel_rpc -- common/autotest_common.sh@973 -- # wait 934226 00:08:28.155 00:08:28.155 real 0m1.542s 00:08:28.155 user 0m1.649s 00:08:28.155 sys 0m0.439s 00:08:28.155 10:03:50 accel_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:28.155 10:03:50 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:28.155 ************************************ 00:08:28.155 END TEST accel_rpc 00:08:28.155 ************************************ 00:08:28.416 10:03:50 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:28.416 10:03:50 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:08:28.416 10:03:50 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:28.416 10:03:50 -- common/autotest_common.sh@10 -- # set +x 00:08:28.416 ************************************ 00:08:28.416 START TEST app_cmdline 00:08:28.416 ************************************ 00:08:28.416 10:03:50 app_cmdline -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:28.416 * Looking for test storage... 
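Note: the accel_assign_opcode suite above drives a paused target purely over RPC: the copy opcode is first assigned to a non-existent module (the "assigned to module incorrect" notice), then to the software module, the framework is started, and the assignment is read back and grepped. The same sequence with rpc.py against a target started with --wait-for-rpc, as a sketch (rpc.py talks to /var/tmp/spdk.sock by default, the socket named in the "Waiting for process to start up..." lines; the test script uses waitforlisten before issuing RPCs, which the sleep stands in for here):

    ./build/bin/spdk_tgt --wait-for-rpc &
    sleep 1                                                     # waitforlisten stand-in
    ./scripts/rpc.py accel_assign_opc -o copy -m software
    ./scripts/rpc.py framework_start_init
    ./scripts/rpc.py accel_get_opc_assignments | jq -r .copy    # expect: software
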
00:08:28.416 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:28.416 10:03:50 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:28.416 10:03:50 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=934501 00:08:28.416 10:03:50 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 934501 00:08:28.416 10:03:50 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:28.416 10:03:50 app_cmdline -- common/autotest_common.sh@830 -- # '[' -z 934501 ']' 00:08:28.416 10:03:50 app_cmdline -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:28.416 10:03:50 app_cmdline -- common/autotest_common.sh@835 -- # local max_retries=100 00:08:28.416 10:03:50 app_cmdline -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:28.416 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:28.416 10:03:50 app_cmdline -- common/autotest_common.sh@839 -- # xtrace_disable 00:08:28.416 10:03:50 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:28.416 [2024-06-10 10:03:50.257478] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:08:28.416 [2024-06-10 10:03:50.257548] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid934501 ] 00:08:28.677 [2024-06-10 10:03:50.357295] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.677 [2024-06-10 10:03:50.434959] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.246 10:03:51 app_cmdline -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:08:29.246 10:03:51 app_cmdline -- common/autotest_common.sh@863 -- # return 0 00:08:29.246 10:03:51 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:29.506 { 00:08:29.506 "version": "SPDK v24.09-pre git sha1 3a44739b7", 00:08:29.506 "fields": { 00:08:29.506 "major": 24, 00:08:29.506 "minor": 9, 00:08:29.506 "patch": 0, 00:08:29.506 "suffix": "-pre", 00:08:29.506 "commit": "3a44739b7" 00:08:29.506 } 00:08:29.506 } 00:08:29.506 10:03:51 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:29.506 10:03:51 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:29.506 10:03:51 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:29.506 10:03:51 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:29.506 10:03:51 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:29.506 10:03:51 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:29.506 10:03:51 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:29.506 10:03:51 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:29.506 10:03:51 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:29.506 
10:03:51 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@649 -- # local es=0 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:29.506 10:03:51 app_cmdline -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:29.766 request: 00:08:29.766 { 00:08:29.766 "method": "env_dpdk_get_mem_stats", 00:08:29.766 "req_id": 1 00:08:29.766 } 00:08:29.766 Got JSON-RPC error response 00:08:29.766 response: 00:08:29.766 { 00:08:29.766 "code": -32601, 00:08:29.766 "message": "Method not found" 00:08:29.766 } 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@652 -- # es=1 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:08:29.766 10:03:51 app_cmdline -- app/cmdline.sh@1 -- # killprocess 934501 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@949 -- # '[' -z 934501 ']' 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@953 -- # kill -0 934501 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@954 -- # uname 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 934501 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@967 -- # echo 'killing process with pid 934501' 00:08:29.766 killing process with pid 934501 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@968 -- # kill 934501 00:08:29.766 10:03:51 app_cmdline -- common/autotest_common.sh@973 -- # wait 934501 00:08:30.026 00:08:30.026 real 0m1.636s 00:08:30.026 user 0m2.009s 00:08:30.026 sys 0m0.399s 00:08:30.026 10:03:51 app_cmdline -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:30.026 10:03:51 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:30.026 ************************************ 
00:08:30.026 END TEST app_cmdline 00:08:30.026 ************************************ 00:08:30.026 10:03:51 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:30.026 10:03:51 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:08:30.026 10:03:51 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:30.026 10:03:51 -- common/autotest_common.sh@10 -- # set +x 00:08:30.026 ************************************ 00:08:30.026 START TEST version 00:08:30.026 ************************************ 00:08:30.026 10:03:51 version -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:30.026 * Looking for test storage... 00:08:30.287 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:30.287 10:03:51 version -- app/version.sh@17 -- # get_header_version major 00:08:30.287 10:03:51 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:30.287 10:03:51 version -- app/version.sh@14 -- # cut -f2 00:08:30.287 10:03:51 version -- app/version.sh@14 -- # tr -d '"' 00:08:30.287 10:03:51 version -- app/version.sh@17 -- # major=24 00:08:30.287 10:03:51 version -- app/version.sh@18 -- # get_header_version minor 00:08:30.287 10:03:51 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:30.287 10:03:51 version -- app/version.sh@14 -- # cut -f2 00:08:30.287 10:03:51 version -- app/version.sh@14 -- # tr -d '"' 00:08:30.287 10:03:51 version -- app/version.sh@18 -- # minor=9 00:08:30.287 10:03:51 version -- app/version.sh@19 -- # get_header_version patch 00:08:30.287 10:03:51 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:30.287 10:03:51 version -- app/version.sh@14 -- # cut -f2 00:08:30.287 10:03:51 version -- app/version.sh@14 -- # tr -d '"' 00:08:30.287 10:03:51 version -- app/version.sh@19 -- # patch=0 00:08:30.287 10:03:51 version -- app/version.sh@20 -- # get_header_version suffix 00:08:30.287 10:03:51 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:30.287 10:03:51 version -- app/version.sh@14 -- # cut -f2 00:08:30.287 10:03:51 version -- app/version.sh@14 -- # tr -d '"' 00:08:30.287 10:03:51 version -- app/version.sh@20 -- # suffix=-pre 00:08:30.287 10:03:51 version -- app/version.sh@22 -- # version=24.9 00:08:30.287 10:03:51 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:30.287 10:03:51 version -- app/version.sh@28 -- # version=24.9rc0 00:08:30.287 10:03:51 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:30.287 10:03:51 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:30.287 10:03:51 version -- app/version.sh@30 -- # py_version=24.9rc0 00:08:30.287 10:03:51 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:08:30.287 00:08:30.287 real 0m0.183s 
00:08:30.287 user 0m0.092s 00:08:30.287 sys 0m0.134s 00:08:30.287 10:03:51 version -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:30.287 10:03:51 version -- common/autotest_common.sh@10 -- # set +x 00:08:30.287 ************************************ 00:08:30.287 END TEST version 00:08:30.287 ************************************ 00:08:30.287 10:03:52 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:08:30.287 10:03:52 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:30.287 10:03:52 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:08:30.287 10:03:52 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:30.287 10:03:52 -- common/autotest_common.sh@10 -- # set +x 00:08:30.287 ************************************ 00:08:30.287 START TEST blockdev_general 00:08:30.287 ************************************ 00:08:30.287 10:03:52 blockdev_general -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:30.547 * Looking for test storage... 00:08:30.547 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:30.547 10:03:52 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@47 -- # 
spdk_tgt_pid=934925 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 934925 00:08:30.547 10:03:52 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:30.547 10:03:52 blockdev_general -- common/autotest_common.sh@830 -- # '[' -z 934925 ']' 00:08:30.547 10:03:52 blockdev_general -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:30.547 10:03:52 blockdev_general -- common/autotest_common.sh@835 -- # local max_retries=100 00:08:30.547 10:03:52 blockdev_general -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:30.547 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:30.547 10:03:52 blockdev_general -- common/autotest_common.sh@839 -- # xtrace_disable 00:08:30.547 10:03:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:30.547 [2024-06-10 10:03:52.246989] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:08:30.547 [2024-06-10 10:03:52.247054] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid934925 ] 00:08:30.547 [2024-06-10 10:03:52.338088] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.547 [2024-06-10 10:03:52.405753] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.516 10:03:53 blockdev_general -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:08:31.516 10:03:53 blockdev_general -- common/autotest_common.sh@863 -- # return 0 00:08:31.516 10:03:53 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:08:31.516 10:03:53 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:08:31.516 10:03:53 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:08:31.516 10:03:53 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:31.516 10:03:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:31.516 [2024-06-10 10:03:53.245399] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:31.516 [2024-06-10 10:03:53.245438] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:31.516 00:08:31.516 [2024-06-10 10:03:53.253389] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:31.516 [2024-06-10 10:03:53.253405] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:31.516 00:08:31.516 Malloc0 00:08:31.516 Malloc1 00:08:31.516 Malloc2 00:08:31.516 Malloc3 00:08:31.516 Malloc4 00:08:31.516 Malloc5 00:08:31.516 Malloc6 00:08:31.516 Malloc7 00:08:31.516 Malloc8 00:08:31.516 Malloc9 00:08:31.516 [2024-06-10 10:03:53.362368] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:31.516 [2024-06-10 10:03:53.362402] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:31.516 [2024-06-10 10:03:53.362414] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed9e30 00:08:31.516 [2024-06-10 10:03:53.362421] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:08:31.516 [2024-06-10 10:03:53.363556] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:31.516 [2024-06-10 10:03:53.363574] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:31.516 TestPT 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:08:31.792 5000+0 records in 00:08:31.792 5000+0 records out 00:08:31.792 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0164641 s, 622 MB/s 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:31.792 AIO0 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:31.792 10:03:53 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:08:31.792 10:03:53 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:08:31.794 10:03:53 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' 
"6ce5d5a1-a22e-4fee-8fb1-e879789eec27"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6ce5d5a1-a22e-4fee-8fb1-e879789eec27",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "011f1cf8-1435-595b-9b6b-16399b73cc50"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "011f1cf8-1435-595b-9b6b-16399b73cc50",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "758354bf-1639-5fee-820d-2f1ba8824452"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "758354bf-1639-5fee-820d-2f1ba8824452",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "5d76671c-876e-5564-b40f-145b1b49403b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5d76671c-876e-5564-b40f-145b1b49403b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "ca2ab964-7b7c-5d50-a074-ccbe9cb74f45"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ca2ab964-7b7c-5d50-a074-ccbe9cb74f45",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "ba640bda-2909-5826-8776-c91c878faba8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ba640bda-2909-5826-8776-c91c878faba8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "5230c032-14ef-599e-b99f-21a574b35c58"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5230c032-14ef-599e-b99f-21a574b35c58",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "b6e401c9-4756-57d7-85c6-458d93dcf83d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b6e401c9-4756-57d7-85c6-458d93dcf83d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "535872ac-1714-53bf-9241-0ee51d267f88"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "535872ac-1714-53bf-9241-0ee51d267f88",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "0d81cef9-a3dc-5d43-8077-99105869160d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": 
"0d81cef9-a3dc-5d43-8077-99105869160d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "9ce701f8-5e0c-5de7-9c99-b93624485529"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9ce701f8-5e0c-5de7-9c99-b93624485529",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "e4c32242-71df-555a-a60f-bedfbec7e637"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e4c32242-71df-555a-a60f-bedfbec7e637",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "75c0b16a-ebb7-46ee-b58c-ca4f4e3fe9b5"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "75c0b16a-ebb7-46ee-b58c-ca4f4e3fe9b5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "75c0b16a-ebb7-46ee-b58c-ca4f4e3fe9b5",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' 
"base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "c6b4156b-a1df-45af-af71-67b707d91d8f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "8fc2b5c7-b2fa-41a2-b433-3f026e501075",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "0ccf722e-169d-4cc5-bd37-fe852a959dbe"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "0ccf722e-169d-4cc5-bd37-fe852a959dbe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "0ccf722e-169d-4cc5-bd37-fe852a959dbe",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "aab36a6c-e602-4fa3-ba50-b01b5e69cd2c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "a793b199-9798-422a-a354-8d8361eeb95f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "c8f91d05-d93a-4357-8c91-a25130f4d7d6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c8f91d05-d93a-4357-8c91-a25130f4d7d6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c8f91d05-d93a-4357-8c91-a25130f4d7d6",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "755195b0-8bc5-4b3b-ba96-28b91490983d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "157b73e3-5dff-4676-afb9-f09f9ffddc83",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "AIO0",' ' "aliases": [' ' "5b3e88ac-cb0e-466b-8bb7-edc65800a6d3"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "5b3e88ac-cb0e-466b-8bb7-edc65800a6d3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:32.053 10:03:53 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:08:32.053 10:03:53 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:08:32.053 10:03:53 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:08:32.053 10:03:53 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 934925 00:08:32.053 10:03:53 blockdev_general -- common/autotest_common.sh@949 -- # '[' -z 934925 ']' 00:08:32.053 10:03:53 blockdev_general -- common/autotest_common.sh@953 -- # kill -0 934925 00:08:32.053 10:03:53 blockdev_general -- common/autotest_common.sh@954 -- # uname 00:08:32.053 10:03:53 blockdev_general -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:08:32.053 10:03:53 blockdev_general -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 934925 00:08:32.053 10:03:53 blockdev_general -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:08:32.053 10:03:53 blockdev_general -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:08:32.053 10:03:53 blockdev_general -- common/autotest_common.sh@967 -- # echo 'killing process with pid 934925' 00:08:32.053 killing process with pid 934925 00:08:32.053 10:03:53 blockdev_general -- common/autotest_common.sh@968 -- # kill 934925 00:08:32.053 10:03:53 blockdev_general -- common/autotest_common.sh@973 -- # wait 934925 00:08:32.314 10:03:54 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:32.314 10:03:54 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:32.314 10:03:54 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:08:32.314 10:03:54 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:32.314 10:03:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:32.314 ************************************ 00:08:32.314 START TEST bdev_hello_world 00:08:32.314 ************************************ 00:08:32.314 10:03:54 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:32.314 [2024-06-10 10:03:54.099876] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:08:32.314 [2024-06-10 10:03:54.099921] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid935277 ] 00:08:32.574 [2024-06-10 10:03:54.187494] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.574 [2024-06-10 10:03:54.258309] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.574 [2024-06-10 10:03:54.386713] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:32.574 [2024-06-10 10:03:54.386758] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:32.574 [2024-06-10 10:03:54.386766] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:32.574 [2024-06-10 10:03:54.394719] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:32.574 [2024-06-10 10:03:54.394737] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:32.574 [2024-06-10 10:03:54.402731] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:32.574 [2024-06-10 10:03:54.402748] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:32.834 [2024-06-10 10:03:54.463515] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:32.834 [2024-06-10 10:03:54.463552] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:32.834 [2024-06-10 10:03:54.463561] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1489160 00:08:32.834 [2024-06-10 10:03:54.463567] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:32.834 [2024-06-10 10:03:54.464736] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:32.834 [2024-06-10 10:03:54.464755] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:32.834 [2024-06-10 10:03:54.586137] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:08:32.834 [2024-06-10 10:03:54.586170] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:08:32.834 [2024-06-10 10:03:54.586192] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:08:32.834 [2024-06-10 10:03:54.586226] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:08:32.834 [2024-06-10 10:03:54.586262] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:08:32.834 [2024-06-10 10:03:54.586273] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:08:32.834 [2024-06-10 10:03:54.586299] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
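Note: hello_bdev, traced above, is the minimal bdev API example: it loads the bdev layout from bdev.json, opens Malloc0, gets an I/O channel, writes "Hello World!" and reads it back, which is exactly the sequence of NOTICE lines from hello_start, hello_write, write_complete, hello_read and read_complete. It can be run directly with the same arguments the test used (paths relative to the SPDK checkout):

    ./build/examples/hello_bdev --json ./test/bdev/bdev.json -b Malloc0
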
00:08:32.834 00:08:32.834 [2024-06-10 10:03:54.586314] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:08:33.095 00:08:33.095 real 0m0.719s 00:08:33.095 user 0m0.488s 00:08:33.095 sys 0m0.191s 00:08:33.095 10:03:54 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:33.095 10:03:54 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:08:33.095 ************************************ 00:08:33.095 END TEST bdev_hello_world 00:08:33.095 ************************************ 00:08:33.095 10:03:54 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:08:33.095 10:03:54 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:08:33.095 10:03:54 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:33.095 10:03:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:33.095 ************************************ 00:08:33.095 START TEST bdev_bounds 00:08:33.095 ************************************ 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=935502 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 935502' 00:08:33.095 Process bdevio pid: 935502 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 935502 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 935502 ']' 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:33.095 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:08:33.095 10:03:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:33.095 [2024-06-10 10:03:54.887033] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
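Note: bdev_bounds starts the bdevio application in wait-for-tests mode (-w) with no pre-reserved memory (-s 0, the PRE_RESERVED_MEM value set earlier) against the same bdev.json, then triggers the per-bdev CUnit suites listed below through the perform_tests RPC issued by tests.py. An equivalent manual run from the SPDK checkout, as a sketch (the script's cleanup trap kills the background bdevio afterwards):

    ./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json &
    bdevio_pid=$!
    ./test/bdev/bdevio/tests.py perform_tests                   # once the RPC socket is up
    kill $bdevio_pid
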
00:08:33.095 [2024-06-10 10:03:54.887081] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid935502 ] 00:08:33.356 [2024-06-10 10:03:54.975518] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:33.356 [2024-06-10 10:03:55.042612] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:08:33.356 [2024-06-10 10:03:55.042721] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:08:33.356 [2024-06-10 10:03:55.042724] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.356 [2024-06-10 10:03:55.160584] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:33.356 [2024-06-10 10:03:55.160622] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:33.356 [2024-06-10 10:03:55.160631] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:33.356 [2024-06-10 10:03:55.168595] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:33.356 [2024-06-10 10:03:55.168614] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:33.356 [2024-06-10 10:03:55.176609] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:33.356 [2024-06-10 10:03:55.176626] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:33.616 [2024-06-10 10:03:55.237374] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:33.616 [2024-06-10 10:03:55.237412] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:33.616 [2024-06-10 10:03:55.237421] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2af59e0 00:08:33.616 [2024-06-10 10:03:55.237427] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:33.616 [2024-06-10 10:03:55.238684] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:33.616 [2024-06-10 10:03:55.238703] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:33.876 10:03:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:08:33.876 10:03:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:08:33.876 10:03:55 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:08:34.137 I/O targets: 00:08:34.137 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:08:34.137 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:08:34.137 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:08:34.137 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:08:34.137 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:08:34.137 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:08:34.137 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:08:34.137 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:08:34.137 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:08:34.137 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:08:34.137 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:08:34.137 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:08:34.137 raid0: 131072 blocks of 512 bytes (64 MiB) 00:08:34.137 concat0: 131072 blocks of 512 bytes (64 MiB) 00:08:34.137 raid1: 65536 
blocks of 512 bytes (32 MiB) 00:08:34.137 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:08:34.137 00:08:34.137 00:08:34.137 CUnit - A unit testing framework for C - Version 2.1-3 00:08:34.137 http://cunit.sourceforge.net/ 00:08:34.137 00:08:34.137 00:08:34.137 Suite: bdevio tests on: AIO0 00:08:34.137 Test: blockdev write read block ...passed 00:08:34.137 Test: blockdev write zeroes read block ...passed 00:08:34.137 Test: blockdev write zeroes read no split ...passed 00:08:34.137 Test: blockdev write zeroes read split ...passed 00:08:34.137 Test: blockdev write zeroes read split partial ...passed 00:08:34.137 Test: blockdev reset ...passed 00:08:34.137 Test: blockdev write read 8 blocks ...passed 00:08:34.137 Test: blockdev write read size > 128k ...passed 00:08:34.137 Test: blockdev write read invalid size ...passed 00:08:34.137 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.137 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.137 Test: blockdev write read max offset ...passed 00:08:34.137 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.137 Test: blockdev writev readv 8 blocks ...passed 00:08:34.137 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.137 Test: blockdev writev readv block ...passed 00:08:34.137 Test: blockdev writev readv size > 128k ...passed 00:08:34.137 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.137 Test: blockdev comparev and writev ...passed 00:08:34.137 Test: blockdev nvme passthru rw ...passed 00:08:34.138 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.138 Test: blockdev nvme admin passthru ...passed 00:08:34.138 Test: blockdev copy ...passed 00:08:34.138 Suite: bdevio tests on: raid1 00:08:34.138 Test: blockdev write read block ...passed 00:08:34.138 Test: blockdev write zeroes read block ...passed 00:08:34.138 Test: blockdev write zeroes read no split ...passed 00:08:34.138 Test: blockdev write zeroes read split ...passed 00:08:34.138 Test: blockdev write zeroes read split partial ...passed 00:08:34.138 Test: blockdev reset ...passed 00:08:34.138 Test: blockdev write read 8 blocks ...passed 00:08:34.138 Test: blockdev write read size > 128k ...passed 00:08:34.138 Test: blockdev write read invalid size ...passed 00:08:34.138 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.138 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.138 Test: blockdev write read max offset ...passed 00:08:34.138 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.138 Test: blockdev writev readv 8 blocks ...passed 00:08:34.138 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.138 Test: blockdev writev readv block ...passed 00:08:34.138 Test: blockdev writev readv size > 128k ...passed 00:08:34.138 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.138 Test: blockdev comparev and writev ...passed 00:08:34.138 Test: blockdev nvme passthru rw ...passed 00:08:34.138 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.138 Test: blockdev nvme admin passthru ...passed 00:08:34.138 Test: blockdev copy ...passed 00:08:34.138 Suite: bdevio tests on: concat0 00:08:34.138 Test: blockdev write read block ...passed 00:08:34.138 Test: blockdev write zeroes read block ...passed 00:08:34.138 Test: blockdev write zeroes read no split ...passed 00:08:34.138 Test: blockdev write zeroes read split ...passed 00:08:34.138 Test: 
blockdev write zeroes read split partial ...passed 00:08:34.138 Test: blockdev reset ...passed 00:08:34.138 Test: blockdev write read 8 blocks ...passed 00:08:34.138 Test: blockdev write read size > 128k ...passed 00:08:34.138 Test: blockdev write read invalid size ...passed 00:08:34.138 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.138 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.138 Test: blockdev write read max offset ...passed 00:08:34.138 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.138 Test: blockdev writev readv 8 blocks ...passed 00:08:34.138 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.138 Test: blockdev writev readv block ...passed 00:08:34.138 Test: blockdev writev readv size > 128k ...passed 00:08:34.138 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.138 Test: blockdev comparev and writev ...passed 00:08:34.138 Test: blockdev nvme passthru rw ...passed 00:08:34.138 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.138 Test: blockdev nvme admin passthru ...passed 00:08:34.138 Test: blockdev copy ...passed 00:08:34.138 Suite: bdevio tests on: raid0 00:08:34.138 Test: blockdev write read block ...passed 00:08:34.138 Test: blockdev write zeroes read block ...passed 00:08:34.138 Test: blockdev write zeroes read no split ...passed 00:08:34.138 Test: blockdev write zeroes read split ...passed 00:08:34.138 Test: blockdev write zeroes read split partial ...passed 00:08:34.138 Test: blockdev reset ...passed 00:08:34.138 Test: blockdev write read 8 blocks ...passed 00:08:34.138 Test: blockdev write read size > 128k ...passed 00:08:34.138 Test: blockdev write read invalid size ...passed 00:08:34.138 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.138 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.138 Test: blockdev write read max offset ...passed 00:08:34.138 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.138 Test: blockdev writev readv 8 blocks ...passed 00:08:34.138 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.138 Test: blockdev writev readv block ...passed 00:08:34.138 Test: blockdev writev readv size > 128k ...passed 00:08:34.138 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.138 Test: blockdev comparev and writev ...passed 00:08:34.138 Test: blockdev nvme passthru rw ...passed 00:08:34.138 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.138 Test: blockdev nvme admin passthru ...passed 00:08:34.138 Test: blockdev copy ...passed 00:08:34.138 Suite: bdevio tests on: TestPT 00:08:34.138 Test: blockdev write read block ...passed 00:08:34.138 Test: blockdev write zeroes read block ...passed 00:08:34.138 Test: blockdev write zeroes read no split ...passed 00:08:34.138 Test: blockdev write zeroes read split ...passed 00:08:34.138 Test: blockdev write zeroes read split partial ...passed 00:08:34.138 Test: blockdev reset ...passed 00:08:34.138 Test: blockdev write read 8 blocks ...passed 00:08:34.138 Test: blockdev write read size > 128k ...passed 00:08:34.138 Test: blockdev write read invalid size ...passed 00:08:34.138 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.138 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.138 Test: blockdev write read max offset ...passed 00:08:34.138 Test: blockdev write read 2 blocks on 
overlapped address offset ...passed 00:08:34.138 Test: blockdev writev readv 8 blocks ...passed 00:08:34.138 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.138 Test: blockdev writev readv block ...passed 00:08:34.138 Test: blockdev writev readv size > 128k ...passed 00:08:34.138 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.138 Test: blockdev comparev and writev ...passed 00:08:34.138 Test: blockdev nvme passthru rw ...passed 00:08:34.138 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.138 Test: blockdev nvme admin passthru ...passed 00:08:34.138 Test: blockdev copy ...passed 00:08:34.138 Suite: bdevio tests on: Malloc2p7 00:08:34.138 Test: blockdev write read block ...passed 00:08:34.138 Test: blockdev write zeroes read block ...passed 00:08:34.138 Test: blockdev write zeroes read no split ...passed 00:08:34.138 Test: blockdev write zeroes read split ...passed 00:08:34.138 Test: blockdev write zeroes read split partial ...passed 00:08:34.138 Test: blockdev reset ...passed 00:08:34.138 Test: blockdev write read 8 blocks ...passed 00:08:34.138 Test: blockdev write read size > 128k ...passed 00:08:34.138 Test: blockdev write read invalid size ...passed 00:08:34.138 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.138 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.138 Test: blockdev write read max offset ...passed 00:08:34.138 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.138 Test: blockdev writev readv 8 blocks ...passed 00:08:34.138 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.138 Test: blockdev writev readv block ...passed 00:08:34.138 Test: blockdev writev readv size > 128k ...passed 00:08:34.138 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.138 Test: blockdev comparev and writev ...passed 00:08:34.138 Test: blockdev nvme passthru rw ...passed 00:08:34.138 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.138 Test: blockdev nvme admin passthru ...passed 00:08:34.138 Test: blockdev copy ...passed 00:08:34.138 Suite: bdevio tests on: Malloc2p6 00:08:34.138 Test: blockdev write read block ...passed 00:08:34.138 Test: blockdev write zeroes read block ...passed 00:08:34.138 Test: blockdev write zeroes read no split ...passed 00:08:34.138 Test: blockdev write zeroes read split ...passed 00:08:34.138 Test: blockdev write zeroes read split partial ...passed 00:08:34.138 Test: blockdev reset ...passed 00:08:34.138 Test: blockdev write read 8 blocks ...passed 00:08:34.138 Test: blockdev write read size > 128k ...passed 00:08:34.138 Test: blockdev write read invalid size ...passed 00:08:34.138 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.138 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.138 Test: blockdev write read max offset ...passed 00:08:34.138 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.138 Test: blockdev writev readv 8 blocks ...passed 00:08:34.138 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.138 Test: blockdev writev readv block ...passed 00:08:34.138 Test: blockdev writev readv size > 128k ...passed 00:08:34.138 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.138 Test: blockdev comparev and writev ...passed 00:08:34.138 Test: blockdev nvme passthru rw ...passed 00:08:34.138 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.138 
Test: blockdev nvme admin passthru ...passed 00:08:34.138 Test: blockdev copy ...passed 00:08:34.138 Suite: bdevio tests on: Malloc2p5 00:08:34.138 Test: blockdev write read block ...passed 00:08:34.138 Test: blockdev write zeroes read block ...passed 00:08:34.138 Test: blockdev write zeroes read no split ...passed 00:08:34.138 Test: blockdev write zeroes read split ...passed 00:08:34.138 Test: blockdev write zeroes read split partial ...passed 00:08:34.138 Test: blockdev reset ...passed 00:08:34.138 Test: blockdev write read 8 blocks ...passed 00:08:34.138 Test: blockdev write read size > 128k ...passed 00:08:34.138 Test: blockdev write read invalid size ...passed 00:08:34.138 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.138 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.138 Test: blockdev write read max offset ...passed 00:08:34.138 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.138 Test: blockdev writev readv 8 blocks ...passed 00:08:34.138 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.138 Test: blockdev writev readv block ...passed 00:08:34.138 Test: blockdev writev readv size > 128k ...passed 00:08:34.138 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.138 Test: blockdev comparev and writev ...passed 00:08:34.138 Test: blockdev nvme passthru rw ...passed 00:08:34.138 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.138 Test: blockdev nvme admin passthru ...passed 00:08:34.138 Test: blockdev copy ...passed 00:08:34.138 Suite: bdevio tests on: Malloc2p4 00:08:34.139 Test: blockdev write read block ...passed 00:08:34.139 Test: blockdev write zeroes read block ...passed 00:08:34.139 Test: blockdev write zeroes read no split ...passed 00:08:34.139 Test: blockdev write zeroes read split ...passed 00:08:34.139 Test: blockdev write zeroes read split partial ...passed 00:08:34.139 Test: blockdev reset ...passed 00:08:34.139 Test: blockdev write read 8 blocks ...passed 00:08:34.139 Test: blockdev write read size > 128k ...passed 00:08:34.139 Test: blockdev write read invalid size ...passed 00:08:34.139 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.139 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.139 Test: blockdev write read max offset ...passed 00:08:34.139 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.139 Test: blockdev writev readv 8 blocks ...passed 00:08:34.139 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.139 Test: blockdev writev readv block ...passed 00:08:34.139 Test: blockdev writev readv size > 128k ...passed 00:08:34.139 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.139 Test: blockdev comparev and writev ...passed 00:08:34.139 Test: blockdev nvme passthru rw ...passed 00:08:34.139 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.139 Test: blockdev nvme admin passthru ...passed 00:08:34.139 Test: blockdev copy ...passed 00:08:34.139 Suite: bdevio tests on: Malloc2p3 00:08:34.139 Test: blockdev write read block ...passed 00:08:34.139 Test: blockdev write zeroes read block ...passed 00:08:34.139 Test: blockdev write zeroes read no split ...passed 00:08:34.139 Test: blockdev write zeroes read split ...passed 00:08:34.139 Test: blockdev write zeroes read split partial ...passed 00:08:34.139 Test: blockdev reset ...passed 00:08:34.139 Test: blockdev write read 8 blocks ...passed 
00:08:34.139 Test: blockdev write read size > 128k ...passed 00:08:34.139 Test: blockdev write read invalid size ...passed 00:08:34.139 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.139 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.139 Test: blockdev write read max offset ...passed 00:08:34.139 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.139 Test: blockdev writev readv 8 blocks ...passed 00:08:34.139 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.139 Test: blockdev writev readv block ...passed 00:08:34.139 Test: blockdev writev readv size > 128k ...passed 00:08:34.139 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.139 Test: blockdev comparev and writev ...passed 00:08:34.139 Test: blockdev nvme passthru rw ...passed 00:08:34.139 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.139 Test: blockdev nvme admin passthru ...passed 00:08:34.139 Test: blockdev copy ...passed 00:08:34.139 Suite: bdevio tests on: Malloc2p2 00:08:34.139 Test: blockdev write read block ...passed 00:08:34.139 Test: blockdev write zeroes read block ...passed 00:08:34.139 Test: blockdev write zeroes read no split ...passed 00:08:34.139 Test: blockdev write zeroes read split ...passed 00:08:34.139 Test: blockdev write zeroes read split partial ...passed 00:08:34.139 Test: blockdev reset ...passed 00:08:34.139 Test: blockdev write read 8 blocks ...passed 00:08:34.139 Test: blockdev write read size > 128k ...passed 00:08:34.139 Test: blockdev write read invalid size ...passed 00:08:34.139 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.139 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.139 Test: blockdev write read max offset ...passed 00:08:34.139 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.139 Test: blockdev writev readv 8 blocks ...passed 00:08:34.139 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.139 Test: blockdev writev readv block ...passed 00:08:34.139 Test: blockdev writev readv size > 128k ...passed 00:08:34.139 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.139 Test: blockdev comparev and writev ...passed 00:08:34.139 Test: blockdev nvme passthru rw ...passed 00:08:34.139 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.139 Test: blockdev nvme admin passthru ...passed 00:08:34.139 Test: blockdev copy ...passed 00:08:34.139 Suite: bdevio tests on: Malloc2p1 00:08:34.139 Test: blockdev write read block ...passed 00:08:34.139 Test: blockdev write zeroes read block ...passed 00:08:34.139 Test: blockdev write zeroes read no split ...passed 00:08:34.139 Test: blockdev write zeroes read split ...passed 00:08:34.139 Test: blockdev write zeroes read split partial ...passed 00:08:34.139 Test: blockdev reset ...passed 00:08:34.139 Test: blockdev write read 8 blocks ...passed 00:08:34.139 Test: blockdev write read size > 128k ...passed 00:08:34.139 Test: blockdev write read invalid size ...passed 00:08:34.139 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.139 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.139 Test: blockdev write read max offset ...passed 00:08:34.139 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.139 Test: blockdev writev readv 8 blocks ...passed 00:08:34.139 Test: blockdev writev readv 30 x 
1block ...passed 00:08:34.139 Test: blockdev writev readv block ...passed 00:08:34.139 Test: blockdev writev readv size > 128k ...passed 00:08:34.139 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.139 Test: blockdev comparev and writev ...passed 00:08:34.139 Test: blockdev nvme passthru rw ...passed 00:08:34.139 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.139 Test: blockdev nvme admin passthru ...passed 00:08:34.139 Test: blockdev copy ...passed 00:08:34.139 Suite: bdevio tests on: Malloc2p0 00:08:34.139 Test: blockdev write read block ...passed 00:08:34.139 Test: blockdev write zeroes read block ...passed 00:08:34.139 Test: blockdev write zeroes read no split ...passed 00:08:34.139 Test: blockdev write zeroes read split ...passed 00:08:34.139 Test: blockdev write zeroes read split partial ...passed 00:08:34.139 Test: blockdev reset ...passed 00:08:34.139 Test: blockdev write read 8 blocks ...passed 00:08:34.139 Test: blockdev write read size > 128k ...passed 00:08:34.139 Test: blockdev write read invalid size ...passed 00:08:34.139 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.139 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.139 Test: blockdev write read max offset ...passed 00:08:34.139 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.139 Test: blockdev writev readv 8 blocks ...passed 00:08:34.139 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.139 Test: blockdev writev readv block ...passed 00:08:34.139 Test: blockdev writev readv size > 128k ...passed 00:08:34.139 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.139 Test: blockdev comparev and writev ...passed 00:08:34.139 Test: blockdev nvme passthru rw ...passed 00:08:34.139 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.139 Test: blockdev nvme admin passthru ...passed 00:08:34.139 Test: blockdev copy ...passed 00:08:34.139 Suite: bdevio tests on: Malloc1p1 00:08:34.139 Test: blockdev write read block ...passed 00:08:34.139 Test: blockdev write zeroes read block ...passed 00:08:34.139 Test: blockdev write zeroes read no split ...passed 00:08:34.139 Test: blockdev write zeroes read split ...passed 00:08:34.139 Test: blockdev write zeroes read split partial ...passed 00:08:34.139 Test: blockdev reset ...passed 00:08:34.139 Test: blockdev write read 8 blocks ...passed 00:08:34.139 Test: blockdev write read size > 128k ...passed 00:08:34.139 Test: blockdev write read invalid size ...passed 00:08:34.139 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.139 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.139 Test: blockdev write read max offset ...passed 00:08:34.139 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.139 Test: blockdev writev readv 8 blocks ...passed 00:08:34.139 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.139 Test: blockdev writev readv block ...passed 00:08:34.139 Test: blockdev writev readv size > 128k ...passed 00:08:34.139 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.139 Test: blockdev comparev and writev ...passed 00:08:34.139 Test: blockdev nvme passthru rw ...passed 00:08:34.139 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.139 Test: blockdev nvme admin passthru ...passed 00:08:34.139 Test: blockdev copy ...passed 00:08:34.139 Suite: bdevio tests on: Malloc1p0 
00:08:34.139 Test: blockdev write read block ...passed 00:08:34.139 Test: blockdev write zeroes read block ...passed 00:08:34.139 Test: blockdev write zeroes read no split ...passed 00:08:34.400 Test: blockdev write zeroes read split ...passed 00:08:34.400 Test: blockdev write zeroes read split partial ...passed 00:08:34.400 Test: blockdev reset ...passed 00:08:34.400 Test: blockdev write read 8 blocks ...passed 00:08:34.400 Test: blockdev write read size > 128k ...passed 00:08:34.400 Test: blockdev write read invalid size ...passed 00:08:34.400 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.400 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.400 Test: blockdev write read max offset ...passed 00:08:34.400 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.400 Test: blockdev writev readv 8 blocks ...passed 00:08:34.400 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.400 Test: blockdev writev readv block ...passed 00:08:34.400 Test: blockdev writev readv size > 128k ...passed 00:08:34.400 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.400 Test: blockdev comparev and writev ...passed 00:08:34.400 Test: blockdev nvme passthru rw ...passed 00:08:34.400 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.400 Test: blockdev nvme admin passthru ...passed 00:08:34.400 Test: blockdev copy ...passed 00:08:34.400 Suite: bdevio tests on: Malloc0 00:08:34.400 Test: blockdev write read block ...passed 00:08:34.400 Test: blockdev write zeroes read block ...passed 00:08:34.400 Test: blockdev write zeroes read no split ...passed 00:08:34.400 Test: blockdev write zeroes read split ...passed 00:08:34.400 Test: blockdev write zeroes read split partial ...passed 00:08:34.400 Test: blockdev reset ...passed 00:08:34.400 Test: blockdev write read 8 blocks ...passed 00:08:34.400 Test: blockdev write read size > 128k ...passed 00:08:34.400 Test: blockdev write read invalid size ...passed 00:08:34.400 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:34.400 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:34.400 Test: blockdev write read max offset ...passed 00:08:34.400 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:34.400 Test: blockdev writev readv 8 blocks ...passed 00:08:34.400 Test: blockdev writev readv 30 x 1block ...passed 00:08:34.401 Test: blockdev writev readv block ...passed 00:08:34.401 Test: blockdev writev readv size > 128k ...passed 00:08:34.401 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:34.401 Test: blockdev comparev and writev ...passed 00:08:34.401 Test: blockdev nvme passthru rw ...passed 00:08:34.401 Test: blockdev nvme passthru vendor specific ...passed 00:08:34.401 Test: blockdev nvme admin passthru ...passed 00:08:34.401 Test: blockdev copy ...passed 00:08:34.401 00:08:34.401 Run Summary: Type Total Ran Passed Failed Inactive 00:08:34.401 suites 16 16 n/a 0 0 00:08:34.401 tests 368 368 368 0 0 00:08:34.401 asserts 2224 2224 2224 0 n/a 00:08:34.401 00:08:34.401 Elapsed time = 0.441 seconds 00:08:34.401 0 00:08:34.401 10:03:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 935502 00:08:34.401 10:03:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 935502 ']' 00:08:34.401 10:03:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 935502 00:08:34.401 10:03:56 
blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:08:34.401 10:03:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:08:34.401 10:03:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 935502 00:08:34.401 10:03:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:08:34.401 10:03:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:08:34.401 10:03:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 935502' 00:08:34.401 killing process with pid 935502 00:08:34.401 10:03:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@968 -- # kill 935502 00:08:34.401 10:03:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@973 -- # wait 935502 00:08:34.661 10:03:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:08:34.661 00:08:34.661 real 0m1.439s 00:08:34.661 user 0m3.778s 00:08:34.661 sys 0m0.338s 00:08:34.661 10:03:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:34.661 10:03:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:34.661 ************************************ 00:08:34.661 END TEST bdev_bounds 00:08:34.661 ************************************ 00:08:34.661 10:03:56 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:34.661 10:03:56 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:08:34.661 10:03:56 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:34.661 10:03:56 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:34.661 ************************************ 00:08:34.661 START TEST bdev_nbd 00:08:34.661 ************************************ 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=935836 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 935836 /var/tmp/spdk-nbd.sock 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 935836 ']' 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:08:34.661 10:03:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:34.662 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:34.662 10:03:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:08:34.662 10:03:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:34.662 [2024-06-10 10:03:56.417170] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
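(A minimal sketch of the per-bdev attach/verify cycle that the nbd trace below performs, assuming bdev_svc is already running with -r /var/tmp/spdk-nbd.sock as launched above; the device name, the /proc/partitions check and the dd parameters are taken from the trace, while the scratch file path and the closing nbd_stop_disk call are illustrative assumptions not shown in this excerpt.)

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# small wrapper so every RPC goes to the nbd socket used by this stage
rpc() { "$SPDK/scripts/rpc.py" -s /var/tmp/spdk-nbd.sock "$@"; }
# export one bdev; nbd_start_disk prints the /dev/nbdX node it attached to
dev=$(rpc nbd_start_disk Malloc0)
# wait for the kernel to publish the device, as the waitfornbd helper does
until grep -q -w "$(basename "$dev")" /proc/partitions; do sleep 0.1; done
# read one 4 KiB block with O_DIRECT to prove the export works
dd if="$dev" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
# detach the device again
rpc nbd_stop_disk "$dev"

The same cycle then repeats for Malloc1p0 through AIO0, one /dev/nbdX per entry in bdev_list.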
00:08:34.662 [2024-06-10 10:03:56.417214] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:34.662 [2024-06-10 10:03:56.506608] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.921 [2024-06-10 10:03:56.571489] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.921 [2024-06-10 10:03:56.687983] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:34.921 [2024-06-10 10:03:56.688025] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:34.921 [2024-06-10 10:03:56.688033] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:34.921 [2024-06-10 10:03:56.695991] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:34.921 [2024-06-10 10:03:56.696009] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:34.921 [2024-06-10 10:03:56.704004] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:34.921 [2024-06-10 10:03:56.704020] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:34.921 [2024-06-10 10:03:56.764495] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:34.921 [2024-06-10 10:03:56.764530] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:34.921 [2024-06-10 10:03:56.764539] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2899a90 00:08:34.921 [2024-06-10 10:03:56.764546] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:34.921 [2024-06-10 10:03:56.765676] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:34.921 [2024-06-10 10:03:56.765695] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:35.491 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:08:35.491 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 
'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:35.492 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:35.752 1+0 records in 00:08:35.752 1+0 records out 00:08:35.752 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298287 s, 13.7 MB/s 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:35.752 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:35.753 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:36.013 1+0 records in 00:08:36.013 1+0 records out 00:08:36.013 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00028521 s, 14.4 MB/s 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:36.013 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:36.274 1+0 records in 00:08:36.274 1+0 records out 00:08:36.274 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284795 s, 14.4 MB/s 00:08:36.274 10:03:57 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:36.274 10:03:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:08:36.274 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:08:36.274 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:08:36.275 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:08:36.275 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:08:36.275 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:36.275 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:36.275 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:36.275 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:08:36.275 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:36.275 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:36.275 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:36.275 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:36.535 1+0 records in 00:08:36.535 1+0 records out 00:08:36.535 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025181 s, 16.3 MB/s 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 
00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd4 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd4 /proc/partitions 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:36.535 1+0 records in 00:08:36.535 1+0 records out 00:08:36.535 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025122 s, 16.3 MB/s 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:36.535 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:08:36.797 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:08:36.797 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:08:36.797 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:08:36.797 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd5 00:08:36.797 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:36.797 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:36.797 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:36.797 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd5 /proc/partitions 00:08:36.797 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:36.797 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:36.797 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:36.798 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:36.798 1+0 records in 
00:08:36.798 1+0 records out 00:08:36.798 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000295498 s, 13.9 MB/s 00:08:36.798 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.798 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:36.798 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.798 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:36.798 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:36.798 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:36.798 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:36.798 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd6 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd6 /proc/partitions 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:37.058 1+0 records in 00:08:37.058 1+0 records out 00:08:37.058 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000379574 s, 10.8 MB/s 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:37.058 10:03:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:08:37.317 10:03:59 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd7 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd7 /proc/partitions 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:37.317 1+0 records in 00:08:37.317 1+0 records out 00:08:37.317 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238188 s, 17.2 MB/s 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:37.317 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd8 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd8 /proc/partitions 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:37.577 1+0 records in 00:08:37.577 1+0 records out 00:08:37.577 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000394438 s, 10.4 MB/s 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:37.577 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd9 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd9 /proc/partitions 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:37.838 1+0 records in 00:08:37.838 1+0 records out 00:08:37.838 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000314432 s, 13.0 MB/s 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:37.838 10:03:59 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:08:38.105 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:08:38.105 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:08:38.105 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.106 1+0 records in 00:08:38.106 1+0 records out 00:08:38.106 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00038526 s, 10.6 MB/s 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( 
i = 1 )) 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.106 1+0 records in 00:08:38.106 1+0 records out 00:08:38.106 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000369034 s, 11.1 MB/s 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.106 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:38.371 10:03:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:38.371 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:38.371 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:38.371 10:03:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd12 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd12 /proc/partitions 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.371 1+0 records in 00:08:38.371 1+0 records out 00:08:38.371 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0004549 s, 9.0 MB/s 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # 
(( i++ )) 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:38.371 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd13 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd13 /proc/partitions 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.631 1+0 records in 00:08:38.631 1+0 records out 00:08:38.631 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000442378 s, 9.3 MB/s 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:38.631 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd14 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd14 /proc/partitions 00:08:38.890 10:04:00 
blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.890 1+0 records in 00:08:38.890 1+0 records out 00:08:38.890 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000465728 s, 8.8 MB/s 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:38.890 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd15 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd15 /proc/partitions 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.150 1+0 records in 00:08:39.150 1+0 records out 00:08:39.150 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000455546 s, 9.0 MB/s 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:39.150 
10:04:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:39.150 10:04:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:39.410 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:08:39.410 { 00:08:39.410 "nbd_device": "/dev/nbd0", 00:08:39.410 "bdev_name": "Malloc0" 00:08:39.410 }, 00:08:39.410 { 00:08:39.410 "nbd_device": "/dev/nbd1", 00:08:39.410 "bdev_name": "Malloc1p0" 00:08:39.410 }, 00:08:39.410 { 00:08:39.410 "nbd_device": "/dev/nbd2", 00:08:39.410 "bdev_name": "Malloc1p1" 00:08:39.410 }, 00:08:39.410 { 00:08:39.410 "nbd_device": "/dev/nbd3", 00:08:39.410 "bdev_name": "Malloc2p0" 00:08:39.410 }, 00:08:39.410 { 00:08:39.411 "nbd_device": "/dev/nbd4", 00:08:39.411 "bdev_name": "Malloc2p1" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd5", 00:08:39.411 "bdev_name": "Malloc2p2" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd6", 00:08:39.411 "bdev_name": "Malloc2p3" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd7", 00:08:39.411 "bdev_name": "Malloc2p4" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd8", 00:08:39.411 "bdev_name": "Malloc2p5" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd9", 00:08:39.411 "bdev_name": "Malloc2p6" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd10", 00:08:39.411 "bdev_name": "Malloc2p7" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd11", 00:08:39.411 "bdev_name": "TestPT" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd12", 00:08:39.411 "bdev_name": "raid0" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd13", 00:08:39.411 "bdev_name": "concat0" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd14", 00:08:39.411 "bdev_name": "raid1" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd15", 00:08:39.411 "bdev_name": "AIO0" 00:08:39.411 } 00:08:39.411 ]' 00:08:39.411 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:08:39.411 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd0", 00:08:39.411 "bdev_name": "Malloc0" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd1", 00:08:39.411 "bdev_name": "Malloc1p0" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd2", 00:08:39.411 "bdev_name": "Malloc1p1" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd3", 00:08:39.411 "bdev_name": "Malloc2p0" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd4", 00:08:39.411 "bdev_name": "Malloc2p1" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd5", 00:08:39.411 "bdev_name": "Malloc2p2" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd6", 00:08:39.411 "bdev_name": "Malloc2p3" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd7", 00:08:39.411 "bdev_name": "Malloc2p4" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd8", 00:08:39.411 "bdev_name": "Malloc2p5" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": 
"/dev/nbd9", 00:08:39.411 "bdev_name": "Malloc2p6" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd10", 00:08:39.411 "bdev_name": "Malloc2p7" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd11", 00:08:39.411 "bdev_name": "TestPT" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd12", 00:08:39.411 "bdev_name": "raid0" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd13", 00:08:39.411 "bdev_name": "concat0" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd14", 00:08:39.411 "bdev_name": "raid1" 00:08:39.411 }, 00:08:39.411 { 00:08:39.411 "nbd_device": "/dev/nbd15", 00:08:39.411 "bdev_name": "AIO0" 00:08:39.411 } 00:08:39.411 ]' 00:08:39.411 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:08:39.411 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:08:39.411 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:39.411 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:08:39.411 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:39.411 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:39.411 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:39.411 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:39.672 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:39.672 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:39.672 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:39.672 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:39.672 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:39.672 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:39.672 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:39.672 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:39.672 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:39.672 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:39.933 10:04:01 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:39.933 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:40.193 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:40.193 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:40.193 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:40.193 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:40.193 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:40.193 10:04:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:40.193 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:40.193 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:40.193 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:40.193 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:40.453 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:40.453 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:40.453 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:40.453 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:40.453 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:40.453 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:40.453 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:40.453 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:40.453 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:40.453 10:04:02 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:40.714 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:40.714 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:40.714 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:40.714 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:40.714 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:40.714 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:40.714 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:40.714 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:40.714 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:40.714 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:40.975 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:40.975 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:40.975 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:40.975 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:40.975 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:40.975 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:40.975 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:40.975 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:40.975 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:40.975 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:41.236 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:41.236 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:41.236 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:41.236 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:41.236 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:41.236 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:41.236 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:41.236 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:41.236 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:41.236 10:04:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:41.497 10:04:03 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:41.497 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:41.756 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:41.756 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:41.756 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:41.756 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:41.756 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:41.756 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:41.756 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:41.756 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:41.757 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:41.757 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:42.017 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:42.017 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:42.017 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:42.017 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:42.017 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:42.017 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:42.017 
10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:42.017 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:42.017 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:42.017 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:42.277 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:42.277 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:42.277 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:42.277 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:42.277 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:42.277 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:42.277 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:42.277 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:42.277 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:42.277 10:04:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:42.537 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:42.537 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:42.537 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:42.537 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:42.537 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:42.537 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:42.537 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:42.537 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:42.537 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:42.537 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:42.797 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:43.057 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:43.058 10:04:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:43.318 /dev/nbd0 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:43.318 1+0 records in 00:08:43.318 1+0 records out 00:08:43.318 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269393 s, 15.2 MB/s 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:43.318 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:08:43.578 /dev/nbd1 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:43.578 1+0 records in 00:08:43.578 1+0 records out 00:08:43.578 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278282 s, 14.7 MB/s 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:43.578 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:43.579 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:08:43.839 /dev/nbd10 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 
)) 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:43.839 1+0 records in 00:08:43.839 1+0 records out 00:08:43.839 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254206 s, 16.1 MB/s 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:43.839 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:08:44.099 /dev/nbd11 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.099 1+0 records in 00:08:44.099 1+0 records out 00:08:44.099 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269988 s, 15.2 MB/s 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm 
-f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:44.099 10:04:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:08:44.359 /dev/nbd12 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd12 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd12 /proc/partitions 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.359 1+0 records in 00:08:44.359 1+0 records out 00:08:44.359 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304515 s, 13.5 MB/s 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:08:44.359 /dev/nbd13 00:08:44.359 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:08:44.618 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:08:44.618 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd13 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:44.619 10:04:06 
blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd13 /proc/partitions 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.619 1+0 records in 00:08:44.619 1+0 records out 00:08:44.619 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306964 s, 13.3 MB/s 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:08:44.619 /dev/nbd14 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd14 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd14 /proc/partitions 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.619 1+0 records in 00:08:44.619 1+0 records out 00:08:44.619 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000425309 s, 9.6 MB/s 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:44.619 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:08:44.879 /dev/nbd15 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd15 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd15 /proc/partitions 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.879 1+0 records in 00:08:44.879 1+0 records out 00:08:44.879 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308108 s, 13.3 MB/s 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:44.879 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:08:45.139 /dev/nbd2 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:45.139 10:04:06 
blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.139 1+0 records in 00:08:45.139 1+0 records out 00:08:45.139 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000346512 s, 11.8 MB/s 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:45.139 10:04:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:08:45.400 /dev/nbd3 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.400 1+0 records in 00:08:45.400 1+0 records out 00:08:45.400 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000209178 s, 19.6 MB/s 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:45.400 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:08:45.660 /dev/nbd4 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd4 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd4 /proc/partitions 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.660 1+0 records in 00:08:45.660 1+0 records out 00:08:45.660 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000377675 s, 10.8 MB/s 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:45.660 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:08:45.921 /dev/nbd5 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd5 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:45.921 10:04:07 
blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd5 /proc/partitions 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.921 1+0 records in 00:08:45.921 1+0 records out 00:08:45.921 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000403322 s, 10.2 MB/s 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:45.921 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:08:46.181 /dev/nbd6 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd6 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd6 /proc/partitions 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.181 1+0 records in 00:08:46.181 1+0 records out 00:08:46.181 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000520611 s, 7.9 MB/s 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:46.181 10:04:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:08:46.181 /dev/nbd7 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd7 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd7 /proc/partitions 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.442 1+0 records in 00:08:46.442 1+0 records out 00:08:46.442 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00044826 s, 9.1 MB/s 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:08:46.442 /dev/nbd8 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd8 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd8 /proc/partitions 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:46.442 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.702 1+0 records in 00:08:46.702 1+0 records out 00:08:46.702 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000510508 s, 8.0 MB/s 00:08:46.702 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.702 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:46.702 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.702 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:46.702 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:08:46.703 /dev/nbd9 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd9 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd9 /proc/partitions 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.703 1+0 records in 00:08:46.703 1+0 records out 00:08:46.703 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000397246 s, 10.3 MB/s 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
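The readiness check repeated for every device above is waitfornbd (common/autotest_common.sh@867-888 in the traced line numbers): wait until the name shows up in /proc/partitions, then prove the node is readable with a single O_DIRECT read of 4 KiB. A sketch reconstructed from the traced commands, not the verbatim helper; the scratch-file path is the one used in this run, and the sleep pacing is an assumption (the traced runs succeed on the first try).

    # Poll until /dev/<nbd_name> is registered and serves reads; returns 0 on success.
    waitfornbd_sketch() {
        local nbd_name=$1 i size
        local scratch=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1   # assumed pacing; not visible in this trace
        done
        for ((i = 1; i <= 20; i++)); do
            # One 4 KiB direct read; a non-empty scratch file means the NBD path is live.
            dd if=/dev/"$nbd_name" of="$scratch" bs=4096 count=1 iflag=direct || continue
            size=$(stat -c %s "$scratch")
            rm -f "$scratch"
            [ "$size" != 0 ] && return 0
            sleep 0.1   # assumed pacing
        done
        return 1
    }

In this run every probe read completes in well under a millisecond and returns 4096 bytes, so both loops exit on their first iteration for all sixteen devices.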
00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:46.703 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd0", 00:08:46.964 "bdev_name": "Malloc0" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd1", 00:08:46.964 "bdev_name": "Malloc1p0" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd10", 00:08:46.964 "bdev_name": "Malloc1p1" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd11", 00:08:46.964 "bdev_name": "Malloc2p0" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd12", 00:08:46.964 "bdev_name": "Malloc2p1" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd13", 00:08:46.964 "bdev_name": "Malloc2p2" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd14", 00:08:46.964 "bdev_name": "Malloc2p3" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd15", 00:08:46.964 "bdev_name": "Malloc2p4" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd2", 00:08:46.964 "bdev_name": "Malloc2p5" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd3", 00:08:46.964 "bdev_name": "Malloc2p6" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd4", 00:08:46.964 "bdev_name": "Malloc2p7" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd5", 00:08:46.964 "bdev_name": "TestPT" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd6", 00:08:46.964 "bdev_name": "raid0" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd7", 00:08:46.964 "bdev_name": "concat0" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd8", 00:08:46.964 "bdev_name": "raid1" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd9", 00:08:46.964 "bdev_name": "AIO0" 00:08:46.964 } 00:08:46.964 ]' 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd0", 00:08:46.964 "bdev_name": "Malloc0" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd1", 00:08:46.964 "bdev_name": "Malloc1p0" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd10", 00:08:46.964 "bdev_name": "Malloc1p1" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd11", 00:08:46.964 "bdev_name": "Malloc2p0" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd12", 00:08:46.964 "bdev_name": "Malloc2p1" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd13", 00:08:46.964 "bdev_name": "Malloc2p2" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd14", 00:08:46.964 "bdev_name": "Malloc2p3" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd15", 
00:08:46.964 "bdev_name": "Malloc2p4" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd2", 00:08:46.964 "bdev_name": "Malloc2p5" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd3", 00:08:46.964 "bdev_name": "Malloc2p6" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd4", 00:08:46.964 "bdev_name": "Malloc2p7" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd5", 00:08:46.964 "bdev_name": "TestPT" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd6", 00:08:46.964 "bdev_name": "raid0" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd7", 00:08:46.964 "bdev_name": "concat0" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd8", 00:08:46.964 "bdev_name": "raid1" 00:08:46.964 }, 00:08:46.964 { 00:08:46.964 "nbd_device": "/dev/nbd9", 00:08:46.964 "bdev_name": "AIO0" 00:08:46.964 } 00:08:46.964 ]' 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:46.964 /dev/nbd1 00:08:46.964 /dev/nbd10 00:08:46.964 /dev/nbd11 00:08:46.964 /dev/nbd12 00:08:46.964 /dev/nbd13 00:08:46.964 /dev/nbd14 00:08:46.964 /dev/nbd15 00:08:46.964 /dev/nbd2 00:08:46.964 /dev/nbd3 00:08:46.964 /dev/nbd4 00:08:46.964 /dev/nbd5 00:08:46.964 /dev/nbd6 00:08:46.964 /dev/nbd7 00:08:46.964 /dev/nbd8 00:08:46.964 /dev/nbd9' 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:46.964 /dev/nbd1 00:08:46.964 /dev/nbd10 00:08:46.964 /dev/nbd11 00:08:46.964 /dev/nbd12 00:08:46.964 /dev/nbd13 00:08:46.964 /dev/nbd14 00:08:46.964 /dev/nbd15 00:08:46.964 /dev/nbd2 00:08:46.964 /dev/nbd3 00:08:46.964 /dev/nbd4 00:08:46.964 /dev/nbd5 00:08:46.964 /dev/nbd6 00:08:46.964 /dev/nbd7 00:08:46.964 /dev/nbd8 00:08:46.964 /dev/nbd9' 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:46.964 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:08:47.225 256+0 records in 00:08:47.225 256+0 records out 00:08:47.225 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0121669 s, 86.2 MB/s 00:08:47.225 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:47.225 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:47.225 256+0 records in 00:08:47.225 256+0 records out 00:08:47.225 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0843834 s, 12.4 MB/s 00:08:47.225 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:47.225 10:04:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:47.225 256+0 records in 00:08:47.225 256+0 records out 00:08:47.225 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0842336 s, 12.4 MB/s 00:08:47.225 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:47.225 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:08:47.485 256+0 records in 00:08:47.485 256+0 records out 00:08:47.485 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.083247 s, 12.6 MB/s 00:08:47.485 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:47.485 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:08:47.485 256+0 records in 00:08:47.485 256+0 records out 00:08:47.485 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.085091 s, 12.3 MB/s 00:08:47.485 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:47.485 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:08:47.485 256+0 records in 00:08:47.485 256+0 records out 00:08:47.485 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0863666 s, 12.1 MB/s 00:08:47.485 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:47.485 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:08:47.745 256+0 records in 00:08:47.745 256+0 records out 00:08:47.745 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0885104 s, 11.8 MB/s 00:08:47.745 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:47.745 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:08:47.745 256+0 records in 00:08:47.745 256+0 records out 00:08:47.745 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0822881 s, 12.7 MB/s 00:08:47.745 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:47.745 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:08:47.745 256+0 records in 00:08:47.745 256+0 records out 00:08:47.745 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0892356 s, 11.8 MB/s 00:08:47.746 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:47.746 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:08:48.006 256+0 records in 00:08:48.006 256+0 records out 00:08:48.006 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0807617 s, 13.0 MB/s 00:08:48.006 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.006 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:08:48.006 256+0 records in 00:08:48.006 256+0 records out 00:08:48.006 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0878744 s, 11.9 MB/s 00:08:48.006 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.006 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:08:48.006 256+0 records in 00:08:48.006 256+0 records out 00:08:48.006 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0859043 s, 12.2 MB/s 00:08:48.006 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.006 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:08:48.267 256+0 records in 00:08:48.267 256+0 records out 00:08:48.267 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0911981 s, 11.5 MB/s 00:08:48.267 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.267 10:04:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:08:48.267 256+0 records in 00:08:48.267 256+0 records out 00:08:48.267 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0908271 s, 11.5 MB/s 00:08:48.267 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.267 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:08:48.267 256+0 records in 00:08:48.267 256+0 records out 00:08:48.267 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0940717 s, 11.1 MB/s 00:08:48.267 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.267 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:08:48.527 256+0 records in 00:08:48.527 256+0 records out 00:08:48.527 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0905887 s, 11.6 MB/s 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:08:48.527 256+0 records in 00:08:48.527 256+0 records out 00:08:48.527 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0811155 s, 12.9 MB/s 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.527 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.788 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:49.048 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:49.048 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:49.048 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:49.048 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.048 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.048 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:49.048 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.048 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.048 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.048 10:04:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:49.308 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:49.308 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:49.308 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:49.308 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.308 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.308 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:49.308 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.308 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.308 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.308 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:49.567 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:49.567 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:49.567 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:49.567 10:04:11 
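The random-data pass traced around 00:08:47-00:08:48 above is nbd_dd_data_verify (bdev/nbd_common.sh@70-85): write the same 1 MiB of random data to every exported node, then compare each node byte-for-byte against the source file. A sketch built from the traced dd and cmp invocations; the array contents and temp-file path are the ones in the log, and error handling is left to the caller.

    # Write 1 MiB of random data to each NBD node, then verify it byte-for-byte.
    nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14
              /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7
              /dev/nbd8 /dev/nbd9)
    tmp=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
    dd if=/dev/urandom of="$tmp" bs=4096 count=256             # 1 MiB source buffer
    for dev in "${nbd_list[@]}"; do
        dd if="$tmp" of="$dev" bs=4096 count=256 oflag=direct  # direct write, roughly 80-95 ms per node here
    done
    for dev in "${nbd_list[@]}"; do
        cmp -b -n 1M "$tmp" "$dev"                             # any mismatch makes cmp exit non-zero
    done
    rm "$tmp"

cmp prints nothing when the contents match, which is why the verify half of the pass shows only the traced commands and no output.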
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.567 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.567 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:49.567 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.567 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.567 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.568 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:49.568 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:49.568 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:49.568 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:49.568 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.568 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.568 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:49.568 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.568 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.568 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.568 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:49.828 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:49.828 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:49.828 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:49.828 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.828 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.828 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:49.828 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.828 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.828 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.828 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:50.089 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:50.089 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:50.089 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:50.089 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.089 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.089 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:50.089 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.089 10:04:11 
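Tear-down mirrors set-up: for each node, nbd_stop_disk detaches the bdev and waitfornbd_exit (bdev/nbd_common.sh@35-45) polls /proc/partitions until the name is gone. Reconstructed from the traced loop; the function name below is a sketch name, and the sleep branch is an assumption, since every traced iteration takes the immediate break (the kernel has already dropped the partition entry).

    # Detach one NBD node and wait for the kernel to remove it from /proc/partitions.
    stop_and_wait() {
        local dev=$1 name i
        name=$(basename "$dev")
        /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
            -s /var/tmp/spdk-nbd.sock nbd_stop_disk "$dev"
        for ((i = 1; i <= 20; i++)); do
            if grep -q -w "$name" /proc/partitions; then
                sleep 0.1   # assumed pacing; not exercised in this trace
            else
                break       # the branch every traced device hits immediately
            fi
        done
    }

    # Usage matching the order in the log:
    # for dev in "${nbd_list[@]}"; do stop_and_wait "$dev"; done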
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.089 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.089 10:04:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:50.350 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:50.350 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:50.350 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:50.350 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.350 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.350 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:50.350 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.350 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.350 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.350 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.611 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:50.871 10:04:12 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:50.871 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:50.871 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:50.871 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.871 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.871 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:50.871 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.871 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.871 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.871 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:51.133 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:51.133 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:51.133 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:51.133 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.133 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.133 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:51.133 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:51.133 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.133 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.133 10:04:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:51.408 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:51.408 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:51.408 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:51.408 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.408 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.408 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:51.408 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:51.408 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.408 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.408 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:51.408 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.727 10:04:13 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.727 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:51.997 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:51.997 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:51.997 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:51.997 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.997 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.997 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:51.997 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:51.997 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.997 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:51.997 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:51.997 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:52.257 10:04:13 
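Both device counts in this test (16 after set-up, 0 here after tear-down) come from nbd_get_count (bdev/nbd_common.sh@61-66): nbd_get_disks returns a JSON array of nbd_device/bdev_name pairs, jq pulls out the device paths, and grep -c counts them. A sketch along the lines of the traced commands; the trailing true mirrors the traced call, since grep -c exits non-zero when it counts nothing.

    # Count the NBD nodes currently exported by the spdk-nbd target.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-nbd.sock
    nbd_disks_json=$("$rpc" -s "$sock" nbd_get_disks)
    nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
    count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)
    echo "$count"   # 16 while the disks are attached, 0 after the nbd_stop_disk loop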
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:08:52.257 10:04:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:08:52.518 malloc_lvol_verify 00:08:52.518 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:08:52.518 8b07d5d6-ae59-4765-9f57-4497249da815 00:08:52.778 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:08:52.778 f139ee44-b961-4344-8b53-9a92c8e15aac 00:08:52.778 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:08:53.038 /dev/nbd0 00:08:53.038 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:08:53.038 mke2fs 1.46.5 (30-Dec-2021) 00:08:53.038 Discarding device blocks: 0/4096 done 00:08:53.038 Creating filesystem with 4096 1k blocks and 1024 inodes 00:08:53.038 00:08:53.038 Allocating group tables: 0/1 done 00:08:53.038 Writing inode tables: 0/1 done 00:08:53.038 Creating journal (1024 blocks): done 00:08:53.038 Writing superblocks and filesystem accounting information: 0/1 done 00:08:53.038 00:08:53.038 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:08:53.038 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:08:53.038 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:53.038 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:08:53.038 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:53.038 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:53.038 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:53.038 10:04:14 
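The last functional check before shutdown is nbd_with_lvol_verify (bdev/blockdev.sh@324 into bdev/nbd_common.sh@131-142): stack a logical volume on a fresh malloc bdev, export it over NBD, and prove it is usable by putting an ext4 filesystem on it. The traced RPCs amount to the following sequence, with arguments exactly as logged; the UUIDs in the log are values returned by the target, and the size comments interpret the 16/512 and 4 arguments shown above.

    # Create a malloc bdev, carve an lvol from it, export it as /dev/nbd0, format it.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-nbd.sock
    "$rpc" -s "$sock" bdev_malloc_create -b malloc_lvol_verify 16 512   # 16 MB bdev, 512 B blocks
    "$rpc" -s "$sock" bdev_lvol_create_lvstore malloc_lvol_verify lvs   # lvstore on top of it
    "$rpc" -s "$sock" bdev_lvol_create lvol 4 -l lvs                    # 4 MB lvol inside the store
    "$rpc" -s "$sock" nbd_start_disk lvs/lvol /dev/nbd0
    mkfs.ext4 /dev/nbd0            # success is what sets mkfs_ret=0 in the trace
    "$rpc" -s "$sock" nbd_stop_disk /dev/nbd0

A zero mkfs_ret, together with the earlier write/compare pass, is what lets bdev_nbd return 0 and proceed to killprocess 935836 below.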
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 935836 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 935836 ']' 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 935836 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:08:53.298 10:04:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 935836 00:08:53.298 10:04:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:08:53.298 10:04:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:08:53.298 10:04:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 935836' 00:08:53.298 killing process with pid 935836 00:08:53.298 10:04:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@968 -- # kill 935836 00:08:53.298 10:04:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@973 -- # wait 935836 00:08:53.560 10:04:15 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:08:53.560 00:08:53.560 real 0m18.862s 00:08:53.560 user 0m26.431s 00:08:53.560 sys 0m7.793s 00:08:53.560 10:04:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:53.560 10:04:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:53.560 ************************************ 00:08:53.560 END TEST bdev_nbd 00:08:53.560 ************************************ 00:08:53.560 10:04:15 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:08:53.560 10:04:15 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:08:53.560 10:04:15 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:08:53.560 10:04:15 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:08:53.560 10:04:15 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:08:53.560 10:04:15 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:53.560 10:04:15 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:53.560 ************************************ 00:08:53.560 START TEST bdev_fio 
00:08:53.560 ************************************ 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:53.560 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 
-- # echo filename=Malloc1p0 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:08:53.560 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:08:53.561 10:04:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:53.561 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:08:53.561 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:53.561 10:04:15 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:53.561 ************************************ 00:08:53.561 START TEST bdev_fio_rw_verify 00:08:53.561 ************************************ 00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 
00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:08:53.561 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:08:53.832 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:08:53.832 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:08:53.832 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:08:53.832 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:53.832 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:08:53.832 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:08:53.832 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:08:53.832 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:08:53.832 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:53.832 10:04:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:54.093 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 
job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:54.093 fio-3.35 00:08:54.093 Starting 16 threads 00:09:06.320 00:09:06.320 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=939869: Mon Jun 10 10:04:26 2024 00:09:06.320 read: IOPS=113k, BW=441MiB/s (463MB/s)(4413MiB/10001msec) 00:09:06.320 slat (usec): min=2, max=527, avg=27.07, stdev=16.78 00:09:06.320 clat (usec): min=6, max=902, avg=226.01, stdev=122.51 00:09:06.320 lat (usec): min=11, max=957, avg=253.08, stdev=128.96 00:09:06.320 clat percentiles (usec): 00:09:06.320 | 50.000th=[ 219], 99.000th=[ 529], 99.900th=[ 619], 99.990th=[ 693], 00:09:06.320 | 99.999th=[ 750] 00:09:06.320 write: IOPS=172k, BW=674MiB/s (706MB/s)(6636MiB/9852msec); 0 zone resets 00:09:06.320 slat (usec): min=3, max=1273, avg=41.37, stdev=18.73 00:09:06.320 clat (usec): min=7, max=3246, avg=286.39, stdev=144.05 00:09:06.320 lat (usec): min=20, max=3292, avg=327.76, stdev=150.80 00:09:06.320 clat percentiles (usec): 00:09:06.320 | 50.000th=[ 277], 99.000th=[ 627], 99.900th=[ 717], 99.990th=[ 799], 00:09:06.320 | 99.999th=[ 971] 00:09:06.320 bw ( KiB/s): min=586024, max=791139, per=98.74%, avg=681049.84, stdev=3581.71, samples=304 00:09:06.320 iops : min=146506, max=197783, avg=170262.26, stdev=895.41, samples=304 00:09:06.320 lat (usec) : 10=0.01%, 20=0.20%, 50=2.98%, 100=10.10%, 250=36.91% 00:09:06.320 lat (usec) : 500=43.95%, 750=5.84%, 1000=0.02% 00:09:06.320 lat (msec) : 2=0.01%, 4=0.01% 00:09:06.320 cpu : usr=99.39%, sys=0.29%, ctx=618, majf=0, minf=2738 00:09:06.320 IO depths : 1=12.4%, 2=24.7%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:06.320 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:06.320 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:06.320 issued rwts: total=1129654,1698857,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:06.320 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:06.320 00:09:06.320 Run status group 0 (all jobs): 00:09:06.320 READ: bw=441MiB/s (463MB/s), 441MiB/s-441MiB/s (463MB/s-463MB/s), io=4413MiB (4627MB), run=10001-10001msec 00:09:06.320 WRITE: bw=674MiB/s (706MB/s), 674MiB/s-674MiB/s (706MB/s-706MB/s), io=6636MiB (6959MB), run=9852-9852msec 00:09:06.320 00:09:06.320 real 0m11.404s 00:09:06.320 user 2m50.982s 00:09:06.320 sys 0m1.867s 00:09:06.320 10:04:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:06.320 10:04:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:06.320 ************************************ 00:09:06.320 END TEST bdev_fio_rw_verify 00:09:06.320 ************************************ 00:09:06.320 
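Note on the rw-verify run that just finished: it is plain fio driven through the SPDK fio plugin. The harness looks up any ASan runtime the plugin links against with ldd (empty in this run), preloads it together with the plugin so the spdk_bdev ioengine resolves, and points fio at the generated job file and the JSON bdev configuration. Stripped of the wrapper functions, the invocation traced above reduces to roughly the following sketch (paths and flags copied from the trace):

plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output

# pick up the sanitizer runtime, if the plugin was built with ASan
asan_lib=$(ldd "$plugin" | grep libasan | awk '{print $3}')

LD_PRELOAD="$asan_lib $plugin" /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    --verify_state_save=0 --spdk_json_conf="$conf" --spdk_mem=0 \
    --aux-path="$out" "$job"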
10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:09:06.320 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:06.321 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "6ce5d5a1-a22e-4fee-8fb1-e879789eec27"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6ce5d5a1-a22e-4fee-8fb1-e879789eec27",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "011f1cf8-1435-595b-9b6b-16399b73cc50"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "011f1cf8-1435-595b-9b6b-16399b73cc50",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' 
"reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "758354bf-1639-5fee-820d-2f1ba8824452"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "758354bf-1639-5fee-820d-2f1ba8824452",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "5d76671c-876e-5564-b40f-145b1b49403b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5d76671c-876e-5564-b40f-145b1b49403b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "ca2ab964-7b7c-5d50-a074-ccbe9cb74f45"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ca2ab964-7b7c-5d50-a074-ccbe9cb74f45",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "ba640bda-2909-5826-8776-c91c878faba8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ba640bda-2909-5826-8776-c91c878faba8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "5230c032-14ef-599e-b99f-21a574b35c58"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5230c032-14ef-599e-b99f-21a574b35c58",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "b6e401c9-4756-57d7-85c6-458d93dcf83d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b6e401c9-4756-57d7-85c6-458d93dcf83d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "535872ac-1714-53bf-9241-0ee51d267f88"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "535872ac-1714-53bf-9241-0ee51d267f88",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "0d81cef9-a3dc-5d43-8077-99105869160d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0d81cef9-a3dc-5d43-8077-99105869160d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "9ce701f8-5e0c-5de7-9c99-b93624485529"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9ce701f8-5e0c-5de7-9c99-b93624485529",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' 
' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "e4c32242-71df-555a-a60f-bedfbec7e637"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e4c32242-71df-555a-a60f-bedfbec7e637",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "75c0b16a-ebb7-46ee-b58c-ca4f4e3fe9b5"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "75c0b16a-ebb7-46ee-b58c-ca4f4e3fe9b5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "75c0b16a-ebb7-46ee-b58c-ca4f4e3fe9b5",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "c6b4156b-a1df-45af-af71-67b707d91d8f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "8fc2b5c7-b2fa-41a2-b433-3f026e501075",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "0ccf722e-169d-4cc5-bd37-fe852a959dbe"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "0ccf722e-169d-4cc5-bd37-fe852a959dbe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": 
{' ' "raid": {' ' "uuid": "0ccf722e-169d-4cc5-bd37-fe852a959dbe",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "aab36a6c-e602-4fa3-ba50-b01b5e69cd2c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "a793b199-9798-422a-a354-8d8361eeb95f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "c8f91d05-d93a-4357-8c91-a25130f4d7d6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c8f91d05-d93a-4357-8c91-a25130f4d7d6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c8f91d05-d93a-4357-8c91-a25130f4d7d6",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "755195b0-8bc5-4b3b-ba96-28b91490983d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "157b73e3-5dff-4676-afb9-f09f9ffddc83",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "5b3e88ac-cb0e-466b-8bb7-edc65800a6d3"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "5b3e88ac-cb0e-466b-8bb7-edc65800a6d3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:06.321 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:09:06.321 Malloc1p0 00:09:06.321 Malloc1p1 00:09:06.321 Malloc2p0 00:09:06.321 Malloc2p1 00:09:06.321 Malloc2p2 00:09:06.321 Malloc2p3 00:09:06.321 Malloc2p4 00:09:06.321 Malloc2p5 00:09:06.321 Malloc2p6 00:09:06.321 Malloc2p7 00:09:06.321 TestPT 00:09:06.321 raid0 00:09:06.321 concat0 ]] 00:09:06.321 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 
'select(.supported_io_types.unmap == true) | .name' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "6ce5d5a1-a22e-4fee-8fb1-e879789eec27"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6ce5d5a1-a22e-4fee-8fb1-e879789eec27",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "011f1cf8-1435-595b-9b6b-16399b73cc50"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "011f1cf8-1435-595b-9b6b-16399b73cc50",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "758354bf-1639-5fee-820d-2f1ba8824452"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "758354bf-1639-5fee-820d-2f1ba8824452",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "5d76671c-876e-5564-b40f-145b1b49403b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5d76671c-876e-5564-b40f-145b1b49403b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "ca2ab964-7b7c-5d50-a074-ccbe9cb74f45"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ca2ab964-7b7c-5d50-a074-ccbe9cb74f45",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "ba640bda-2909-5826-8776-c91c878faba8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ba640bda-2909-5826-8776-c91c878faba8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "5230c032-14ef-599e-b99f-21a574b35c58"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5230c032-14ef-599e-b99f-21a574b35c58",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "b6e401c9-4756-57d7-85c6-458d93dcf83d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b6e401c9-4756-57d7-85c6-458d93dcf83d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "535872ac-1714-53bf-9241-0ee51d267f88"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "535872ac-1714-53bf-9241-0ee51d267f88",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' 
}' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "0d81cef9-a3dc-5d43-8077-99105869160d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0d81cef9-a3dc-5d43-8077-99105869160d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "9ce701f8-5e0c-5de7-9c99-b93624485529"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9ce701f8-5e0c-5de7-9c99-b93624485529",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "e4c32242-71df-555a-a60f-bedfbec7e637"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e4c32242-71df-555a-a60f-bedfbec7e637",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "75c0b16a-ebb7-46ee-b58c-ca4f4e3fe9b5"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "75c0b16a-ebb7-46ee-b58c-ca4f4e3fe9b5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "75c0b16a-ebb7-46ee-b58c-ca4f4e3fe9b5",' ' 
"strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "c6b4156b-a1df-45af-af71-67b707d91d8f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "8fc2b5c7-b2fa-41a2-b433-3f026e501075",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "0ccf722e-169d-4cc5-bd37-fe852a959dbe"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "0ccf722e-169d-4cc5-bd37-fe852a959dbe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "0ccf722e-169d-4cc5-bd37-fe852a959dbe",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "aab36a6c-e602-4fa3-ba50-b01b5e69cd2c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "a793b199-9798-422a-a354-8d8361eeb95f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "c8f91d05-d93a-4357-8c91-a25130f4d7d6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c8f91d05-d93a-4357-8c91-a25130f4d7d6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c8f91d05-d93a-4357-8c91-a25130f4d7d6",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "755195b0-8bc5-4b3b-ba96-28b91490983d",' ' "is_configured": true,' ' "data_offset": 0,' ' 
"data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "157b73e3-5dff-4676-afb9-f09f9ffddc83",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "5b3e88ac-cb0e-466b-8bb7-edc65800a6d3"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "5b3e88ac-cb0e-466b-8bb7-edc65800a6d3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:06.323 10:04:26 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:06.323 ************************************ 
00:09:06.323 START TEST bdev_fio_trim 00:09:06.323 ************************************ 00:09:06.323 10:04:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:06.323 10:04:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:06.323 10:04:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:09:06.323 10:04:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:06.323 10:04:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:09:06.323 10:04:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:06.323 10:04:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:09:06.323 10:04:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:09:06.323 10:04:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:09:06.323 10:04:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:06.323 10:04:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:09:06.323 10:04:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:09:06.323 10:04:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:09:06.323 10:04:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:09:06.323 10:04:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:09:06.323 10:04:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:06.323 10:04:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:09:06.323 10:04:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:09:06.323 10:04:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:09:06.323 10:04:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:09:06.323 10:04:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:06.323 
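The trace above shows fio_bdev probing for a sanitizer runtime (none is found, so asan_lib stays empty) and then preloading the SPDK fio plugin; the fio launch traced just below follows that pattern. As a minimal standalone sketch of the same invocation, with placeholder paths rather than the ones used by this job:

# Sketch only: drive SPDK bdevs from fio through the spdk_bdev ioengine.
# PLUGIN, CONF and JOBS are placeholder paths, not the ones used by this run.
PLUGIN=/path/to/spdk/build/fio/spdk_bdev   # plugin built via ./configure --with-fio=<fio source>
CONF=/path/to/bdev.json                    # bdev configuration, passed with --spdk_json_conf
JOBS=/path/to/bdev.fio                     # job file with the [job_<bdev>] sections echoed above
LD_PRELOAD="$PLUGIN" /usr/src/fio/fio \
    --ioengine=spdk_bdev --spdk_json_conf="$CONF" \
    --iodepth=8 --bs=4k --runtime=10 "$JOBS"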
10:04:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:06.323 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:06.323 fio-3.35 00:09:06.323 Starting 14 threads 00:09:16.327 00:09:16.327 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=941954: Mon Jun 10 10:04:38 2024 00:09:16.327 write: IOPS=172k, BW=672MiB/s (704MB/s)(6718MiB/10001msec); 0 zone resets 00:09:16.327 slat (usec): min=2, max=454, avg=28.89, stdev=13.88 00:09:16.327 clat (usec): min=11, max=2080, avg=209.41, stdev=84.74 00:09:16.327 lat (usec): min=20, max=2118, avg=238.29, stdev=88.64 00:09:16.327 clat percentiles (usec): 00:09:16.327 | 50.000th=[ 200], 99.000th=[ 437], 99.900th=[ 506], 99.990th=[ 570], 00:09:16.327 | 99.999th=[ 848] 00:09:16.327 bw ( KiB/s): min=613048, max=857171, per=100.00%, avg=690530.95, stdev=6092.32, samples=266 00:09:16.327 iops : min=153262, max=214292, avg=172632.63, stdev=1523.06, samples=266 00:09:16.327 trim: IOPS=172k, BW=672MiB/s (704MB/s)(6718MiB/10001msec); 0 zone resets 00:09:16.327 slat (usec): min=3, max=1795, avg=18.39, stdev= 8.64 00:09:16.327 clat (usec): min=3, max=2118, avg=231.63, stdev=91.77 00:09:16.327 lat (usec): min=9, max=2140, avg=250.02, stdev=94.75 00:09:16.327 clat percentiles (usec): 00:09:16.327 | 50.000th=[ 223], 99.000th=[ 469], 99.900th=[ 537], 99.990th=[ 603], 00:09:16.327 | 99.999th=[ 758] 00:09:16.327 bw ( KiB/s): min=613056, max=857171, 
per=100.00%, avg=690531.37, stdev=6092.32, samples=266 00:09:16.327 iops : min=153264, max=214292, avg=172632.74, stdev=1523.06, samples=266 00:09:16.327 lat (usec) : 4=0.01%, 10=0.04%, 20=0.11%, 50=0.87%, 100=5.57% 00:09:16.327 lat (usec) : 250=59.83%, 500=33.31%, 750=0.25%, 1000=0.01% 00:09:16.327 lat (msec) : 2=0.01%, 4=0.01% 00:09:16.327 cpu : usr=99.71%, sys=0.00%, ctx=604, majf=0, minf=1047 00:09:16.327 IO depths : 1=12.4%, 2=24.8%, 4=50.1%, 8=12.8%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:16.327 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:16.327 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:16.327 issued rwts: total=0,1719904,1719908,0 short=0,0,0,0 dropped=0,0,0,0 00:09:16.327 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:16.327 00:09:16.327 Run status group 0 (all jobs): 00:09:16.327 WRITE: bw=672MiB/s (704MB/s), 672MiB/s-672MiB/s (704MB/s-704MB/s), io=6718MiB (7045MB), run=10001-10001msec 00:09:16.327 TRIM: bw=672MiB/s (704MB/s), 672MiB/s-672MiB/s (704MB/s-704MB/s), io=6718MiB (7045MB), run=10001-10001msec 00:09:16.588 00:09:16.589 real 0m11.221s 00:09:16.589 user 2m30.554s 00:09:16.589 sys 0m0.767s 00:09:16.589 10:04:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:16.589 10:04:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:16.589 ************************************ 00:09:16.589 END TEST bdev_fio_trim 00:09:16.589 ************************************ 00:09:16.589 10:04:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:16.589 10:04:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:16.589 10:04:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:16.589 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:16.589 10:04:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:16.589 00:09:16.589 real 0m22.977s 00:09:16.589 user 5m21.738s 00:09:16.589 sys 0m2.808s 00:09:16.589 10:04:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:16.589 10:04:38 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:16.589 ************************************ 00:09:16.589 END TEST bdev_fio 00:09:16.589 ************************************ 00:09:16.589 10:04:38 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:16.589 10:04:38 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:16.589 10:04:38 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:09:16.589 10:04:38 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:16.589 10:04:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:16.589 ************************************ 00:09:16.589 START TEST bdev_verify 00:09:16.589 ************************************ 00:09:16.589 10:04:38 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:16.589 
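The run_test call above launches the bdevperf example application in verify mode against the generated bdev.json; its startup log follows. Stripped of the harness wrappers, the command reduces to roughly the shape below (the SPDK tree path is a placeholder, and the extra -C flag and trailing empty argument from the trace are left out of the sketch):

# Sketch of the bdevperf verify invocation traced above; the path is a placeholder.
# -q: queue depth, -o: I/O size in bytes, -w: workload, -t: run time in seconds, -m: core mask.
./build/examples/bdevperf --json /path/to/bdev.json \
    -q 128 -o 4096 -w verify -t 5 -m 0x3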
[2024-06-10 10:04:38.386804] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:09:16.589 [2024-06-10 10:04:38.386856] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid943830 ] 00:09:16.850 [2024-06-10 10:04:38.472163] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:16.850 [2024-06-10 10:04:38.536010] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:09:16.850 [2024-06-10 10:04:38.536095] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:16.850 [2024-06-10 10:04:38.655301] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:16.850 [2024-06-10 10:04:38.655341] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:16.850 [2024-06-10 10:04:38.655349] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:16.850 [2024-06-10 10:04:38.663310] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:16.850 [2024-06-10 10:04:38.663329] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:16.850 [2024-06-10 10:04:38.671321] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:16.850 [2024-06-10 10:04:38.671337] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:17.111 [2024-06-10 10:04:38.732449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:17.111 [2024-06-10 10:04:38.732487] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:17.111 [2024-06-10 10:04:38.732497] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ca08f0 00:09:17.111 [2024-06-10 10:04:38.732503] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:17.111 [2024-06-10 10:04:38.733701] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:17.111 [2024-06-10 10:04:38.733722] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:17.111 Running I/O for 5 seconds... 
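The vbdev_passthru notices above ("Match on Malloc3", "created pt_bdev for: TestPT") report the passthru virtual bdev that the test configuration stacks on top of Malloc3. Reproducing that stacking by hand against a running target would look roughly like this; names and sizes are placeholders, and the exact flag spelling should be checked against scripts/rpc.py bdev_passthru_create --help:

# Hypothetical manual equivalent of the passthru stacking reported above.
./scripts/rpc.py bdev_malloc_create -b Malloc3 128 512        # base bdev, sizes illustrative
./scripts/rpc.py bdev_passthru_create -b Malloc3 -p TestPT    # passthru vbdev named TestPT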
00:09:22.401 00:09:22.401 Latency(us) 00:09:22.401 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:22.401 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.401 Verification LBA range: start 0x0 length 0x1000 00:09:22.401 Malloc0 : 5.22 1104.67 4.32 0.00 0.00 115641.97 705.77 212941.59 00:09:22.401 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.401 Verification LBA range: start 0x1000 length 0x1000 00:09:22.401 Malloc0 : 5.22 1079.36 4.22 0.00 0.00 118362.48 422.20 372647.78 00:09:22.401 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.401 Verification LBA range: start 0x0 length 0x800 00:09:22.401 Malloc1p0 : 5.23 563.17 2.20 0.00 0.00 225930.87 2772.68 208102.01 00:09:22.401 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.401 Verification LBA range: start 0x800 length 0x800 00:09:22.401 Malloc1p0 : 5.22 564.02 2.20 0.00 0.00 225744.25 2734.87 212941.59 00:09:22.401 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x0 length 0x800 00:09:22.402 Malloc1p1 : 5.23 562.69 2.20 0.00 0.00 225526.85 2016.49 199229.44 00:09:22.402 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x800 length 0x800 00:09:22.402 Malloc1p1 : 5.22 563.83 2.20 0.00 0.00 225223.37 2003.89 206488.81 00:09:22.402 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x0 length 0x200 00:09:22.402 Malloc2p0 : 5.24 562.15 2.20 0.00 0.00 225154.77 2344.17 198422.84 00:09:22.402 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x200 length 0x200 00:09:22.402 Malloc2p0 : 5.22 563.62 2.20 0.00 0.00 224715.61 2331.57 196809.65 00:09:22.402 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x0 length 0x200 00:09:22.402 Malloc2p1 : 5.24 561.62 2.19 0.00 0.00 224748.54 2155.13 195196.46 00:09:22.402 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x200 length 0x200 00:09:22.402 Malloc2p1 : 5.23 563.16 2.20 0.00 0.00 224296.98 2167.73 184710.70 00:09:22.402 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x0 length 0x200 00:09:22.402 Malloc2p2 : 5.25 561.21 2.19 0.00 0.00 224323.11 2066.90 194389.86 00:09:22.402 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x200 length 0x200 00:09:22.402 Malloc2p2 : 5.23 562.68 2.20 0.00 0.00 223842.70 2079.51 176644.73 00:09:22.402 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x0 length 0x200 00:09:22.402 Malloc2p3 : 5.25 560.73 2.19 0.00 0.00 223983.21 2306.36 197616.25 00:09:22.402 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x200 length 0x200 00:09:22.402 Malloc2p3 : 5.24 562.14 2.20 0.00 0.00 223529.10 2344.17 175838.13 00:09:22.402 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x0 length 0x200 00:09:22.402 Malloc2p4 : 5.25 560.26 2.19 0.00 0.00 223681.57 
2192.94 202455.83 00:09:22.402 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.402 Verification LBA range: start 0x200 length 0x200 00:09:22.402 Malloc2p4 : 5.24 561.61 2.19 0.00 0.00 223247.79 2205.54 177451.32 00:09:22.662 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x0 length 0x200 00:09:22.662 Malloc2p5 : 5.26 559.77 2.19 0.00 0.00 223323.09 2218.14 198422.84 00:09:22.662 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x200 length 0x200 00:09:22.662 Malloc2p5 : 5.25 561.20 2.19 0.00 0.00 222907.80 2218.14 179064.52 00:09:22.662 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x0 length 0x200 00:09:22.662 Malloc2p6 : 5.26 559.53 2.19 0.00 0.00 222834.49 1978.68 196809.65 00:09:22.662 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x200 length 0x200 00:09:22.662 Malloc2p6 : 5.25 560.72 2.19 0.00 0.00 222486.27 2003.89 178257.92 00:09:22.662 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x0 length 0x200 00:09:22.662 Malloc2p7 : 5.26 559.20 2.18 0.00 0.00 222442.50 2104.71 196003.05 00:09:22.662 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x200 length 0x200 00:09:22.662 Malloc2p7 : 5.25 560.25 2.19 0.00 0.00 222105.95 2142.52 179871.11 00:09:22.662 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x0 length 0x1000 00:09:22.662 TestPT : 5.27 558.89 2.18 0.00 0.00 222010.99 1966.08 187937.08 00:09:22.662 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x1000 length 0x1000 00:09:22.662 TestPT : 5.27 539.13 2.11 0.00 0.00 229905.32 11746.07 256497.82 00:09:22.662 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x0 length 0x2000 00:09:22.662 raid0 : 5.27 558.59 2.18 0.00 0.00 221445.11 2003.89 190356.87 00:09:22.662 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x2000 length 0x2000 00:09:22.662 raid0 : 5.26 559.58 2.19 0.00 0.00 221245.91 2054.30 179064.52 00:09:22.662 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x0 length 0x2000 00:09:22.662 concat0 : 5.27 558.25 2.18 0.00 0.00 221160.29 2016.49 197616.25 00:09:22.662 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x2000 length 0x2000 00:09:22.662 concat0 : 5.26 559.25 2.18 0.00 0.00 220951.63 2016.49 190356.87 00:09:22.662 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x0 length 0x1000 00:09:22.662 raid1 : 5.28 558.08 2.18 0.00 0.00 220774.80 2344.17 206488.81 00:09:22.662 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: start 0x1000 length 0x1000 00:09:22.662 raid1 : 5.27 558.93 2.18 0.00 0.00 220647.66 2432.39 199229.44 00:09:22.662 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.662 Verification LBA range: 
start 0x0 length 0x4e2 00:09:22.662 AIO0 : 5.28 557.94 2.18 0.00 0.00 220478.65 913.72 214554.78 00:09:22.662 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.663 Verification LBA range: start 0x4e2 length 0x4e2 00:09:22.663 AIO0 : 5.27 558.66 2.18 0.00 0.00 220347.11 913.72 206488.81 00:09:22.663 =================================================================================================================== 00:09:22.663 Total : 18984.90 74.16 0.00 0.00 211125.53 422.20 372647.78 00:09:22.663 00:09:22.663 real 0m6.136s 00:09:22.663 user 0m11.656s 00:09:22.663 sys 0m0.245s 00:09:22.663 10:04:44 blockdev_general.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:22.663 10:04:44 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:09:22.663 ************************************ 00:09:22.663 END TEST bdev_verify 00:09:22.663 ************************************ 00:09:22.663 10:04:44 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:22.663 10:04:44 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:09:22.663 10:04:44 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:22.663 10:04:44 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:22.923 ************************************ 00:09:22.923 START TEST bdev_verify_big_io 00:09:22.923 ************************************ 00:09:22.923 10:04:44 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:22.923 [2024-06-10 10:04:44.611993] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:09:22.923 [2024-06-10 10:04:44.612036] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid944775 ] 00:09:22.923 [2024-06-10 10:04:44.696486] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:22.923 [2024-06-10 10:04:44.759396] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:09:22.923 [2024-06-10 10:04:44.759401] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:23.184 [2024-06-10 10:04:44.881703] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:23.184 [2024-06-10 10:04:44.881747] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:23.184 [2024-06-10 10:04:44.881755] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:23.184 [2024-06-10 10:04:44.889710] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:23.184 [2024-06-10 10:04:44.889728] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:23.184 [2024-06-10 10:04:44.897725] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:23.184 [2024-06-10 10:04:44.897740] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:23.184 [2024-06-10 10:04:44.958612] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:23.184 [2024-06-10 10:04:44.958651] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:23.184 [2024-06-10 10:04:44.958661] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x256f8f0 00:09:23.184 [2024-06-10 10:04:44.958668] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:23.184 [2024-06-10 10:04:44.959883] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:23.184 [2024-06-10 10:04:44.959902] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:23.445 [2024-06-10 10:04:45.113414] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.114381] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.115700] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.116639] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.117991] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.118980] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.120358] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.121686] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.122479] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.123551] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.124277] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.125344] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.126103] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.127164] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.127891] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.128945] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32 00:09:23.446 [2024-06-10 10:04:45.144200] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:23.446 [2024-06-10 10:04:45.145448] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:23.446 Running I/O for 5 seconds... 00:09:31.582 00:09:31.582 Latency(us) 00:09:31.582 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:31.582 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x0 length 0x100 00:09:31.582 Malloc0 : 5.91 173.19 10.82 0.00 0.00 725139.86 724.68 2168132.53 00:09:31.582 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x100 length 0x100 00:09:31.582 Malloc0 : 5.61 159.66 9.98 0.00 0.00 785826.15 724.68 2181038.08 00:09:31.582 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x0 length 0x80 00:09:31.582 Malloc1p0 : 6.37 60.26 3.77 0.00 0.00 1958784.43 1802.24 3045709.98 00:09:31.582 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x80 length 0x80 00:09:31.582 Malloc1p0 : 6.17 70.67 4.42 0.00 0.00 1639885.41 2003.89 2619826.81 00:09:31.582 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x0 length 0x80 00:09:31.582 Malloc1p1 : 6.68 38.31 2.39 0.00 0.00 2938550.54 1417.85 5033164.80 00:09:31.582 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x80 length 0x80 00:09:31.582 Malloc1p1 : 6.64 38.53 2.41 0.00 0.00 2928828.94 1197.29 5058975.90 00:09:31.582 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x0 length 0x20 00:09:31.582 Malloc2p0 : 6.17 25.92 1.62 0.00 0.00 1084419.65 488.37 1793871.56 00:09:31.582 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x20 length 0x20 00:09:31.582 Malloc2p0 : 6.17 25.93 1.62 0.00 0.00 1083935.48 504.12 1858399.31 00:09:31.582 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x0 length 0x20 00:09:31.582 Malloc2p1 : 6.27 28.07 1.75 0.00 0.00 1010340.03 478.92 1768060.46 00:09:31.582 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x20 length 0x20 00:09:31.582 Malloc2p1 : 6.17 25.91 1.62 0.00 0.00 1074849.23 482.07 1832588.21 00:09:31.582 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x0 length 0x20 00:09:31.582 Malloc2p2 : 6.27 28.06 1.75 0.00 0.00 1001959.27 491.52 1742249.35 00:09:31.582 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x20 length 0x20 00:09:31.582 Malloc2p2 : 6.27 28.08 1.75 0.00 0.00 1000249.96 488.37 1806777.11 00:09:31.582 Job: Malloc2p3 (Core Mask 0x1, workload: 
verify, depth: 32, IO size: 65536) 00:09:31.582 Verification LBA range: start 0x0 length 0x20 00:09:31.582 Malloc2p3 : 6.28 28.04 1.75 0.00 0.00 993451.90 478.92 1716438.25 00:09:31.583 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x20 length 0x20 00:09:31.583 Malloc2p3 : 6.27 28.07 1.75 0.00 0.00 991735.45 491.52 1780966.01 00:09:31.583 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x0 length 0x20 00:09:31.583 Malloc2p4 : 6.28 28.02 1.75 0.00 0.00 984997.03 478.92 1703532.70 00:09:31.583 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x20 length 0x20 00:09:31.583 Malloc2p4 : 6.27 28.06 1.75 0.00 0.00 983736.26 485.22 1755154.90 00:09:31.583 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x0 length 0x20 00:09:31.583 Malloc2p5 : 6.28 28.01 1.75 0.00 0.00 976838.81 488.37 1677721.60 00:09:31.583 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x20 length 0x20 00:09:31.583 Malloc2p5 : 6.28 28.04 1.75 0.00 0.00 975641.77 488.37 1742249.35 00:09:31.583 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x0 length 0x20 00:09:31.583 Malloc2p6 : 6.29 27.99 1.75 0.00 0.00 968803.17 482.07 1651910.50 00:09:31.583 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x20 length 0x20 00:09:31.583 Malloc2p6 : 6.28 28.02 1.75 0.00 0.00 967605.84 500.97 1716438.25 00:09:31.583 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x0 length 0x20 00:09:31.583 Malloc2p7 : 6.29 27.99 1.75 0.00 0.00 960521.93 567.14 1626099.40 00:09:31.583 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x20 length 0x20 00:09:31.583 Malloc2p7 : 6.28 28.01 1.75 0.00 0.00 959372.09 488.37 1690627.15 00:09:31.583 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x0 length 0x100 00:09:31.583 TestPT : 6.74 38.30 2.39 0.00 0.00 2688446.95 96791.63 4000720.74 00:09:31.583 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x100 length 0x100 00:09:31.583 TestPT : 6.69 38.28 2.39 0.00 0.00 2694414.97 101631.21 3974909.64 00:09:31.583 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x0 length 0x200 00:09:31.583 raid0 : 6.79 42.42 2.65 0.00 0.00 2357482.98 1310.72 4491131.67 00:09:31.583 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x200 length 0x200 00:09:31.583 raid0 : 6.57 43.87 2.74 0.00 0.00 2297400.43 1317.02 4516942.77 00:09:31.583 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x0 length 0x200 00:09:31.583 concat0 : 6.68 47.87 2.99 0.00 0.00 2049266.03 1279.21 4336265.06 00:09:31.583 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x200 length 0x200 00:09:31.583 concat0 : 6.57 57.25 3.58 0.00 0.00 
1734130.12 1298.12 4362076.16 00:09:31.583 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x0 length 0x100 00:09:31.583 raid1 : 6.74 56.98 3.56 0.00 0.00 1693317.66 1802.24 4181398.45 00:09:31.583 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x100 length 0x100 00:09:31.583 raid1 : 6.69 57.40 3.59 0.00 0.00 1676035.91 1789.64 4181398.45 00:09:31.583 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x0 length 0x4e 00:09:31.583 AIO0 : 6.79 74.51 4.66 0.00 0.00 770767.97 365.49 2452054.65 00:09:31.583 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:09:31.583 Verification LBA range: start 0x4e length 0x4e 00:09:31.583 AIO0 : 6.79 66.57 4.16 0.00 0.00 861728.22 812.90 2632732.36 00:09:31.583 =================================================================================================================== 00:09:31.583 Total : 1506.29 94.14 0.00 0.00 1394077.71 365.49 5058975.90 00:09:31.583 00:09:31.583 real 0m7.732s 00:09:31.583 user 0m14.774s 00:09:31.583 sys 0m0.264s 00:09:31.583 10:04:52 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:31.583 10:04:52 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:09:31.583 ************************************ 00:09:31.583 END TEST bdev_verify_big_io 00:09:31.583 ************************************ 00:09:31.583 10:04:52 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:31.583 10:04:52 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:09:31.583 10:04:52 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:31.583 10:04:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:31.583 ************************************ 00:09:31.583 START TEST bdev_write_zeroes 00:09:31.583 ************************************ 00:09:31.583 10:04:52 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:31.583 [2024-06-10 10:04:52.406607] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:09:31.583 [2024-06-10 10:04:52.406654] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid946040 ] 00:09:31.583 [2024-06-10 10:04:52.496618] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:31.583 [2024-06-10 10:04:52.571615] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:31.583 [2024-06-10 10:04:52.694156] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:31.583 [2024-06-10 10:04:52.694199] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:31.583 [2024-06-10 10:04:52.694208] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:31.583 [2024-06-10 10:04:52.702166] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:31.583 [2024-06-10 10:04:52.702184] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:31.583 [2024-06-10 10:04:52.710177] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:31.583 [2024-06-10 10:04:52.710193] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:31.583 [2024-06-10 10:04:52.771070] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:31.583 [2024-06-10 10:04:52.771105] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:31.583 [2024-06-10 10:04:52.771114] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf89360 00:09:31.583 [2024-06-10 10:04:52.771121] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:31.583 [2024-06-10 10:04:52.772297] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:31.583 [2024-06-10 10:04:52.772316] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:31.583 Running I/O for 1 seconds... 
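Each of these bdevperf runs loads the same generated bdev.json via --json; the bdev_json_nonenclosed and bdev_json_nonarray tests further on feed deliberately malformed variants and expect the "Invalid JSON configuration" errors seen below. For orientation, a well-formed config of the kind bdevperf accepts has roughly the shape sketched here (bdev name and sizes are illustrative, matching the Malloc geometry reported later by bdev_get_bdevs):

# Illustrative minimal bdev JSON config of the kind bdevperf --json expects.
cat > /tmp/minimal_bdev.json << 'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 262144, "block_size": 512 }
        }
      ]
    }
  ]
}
EOF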
00:09:32.522 00:09:32.522 Latency(us) 00:09:32.522 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:32.522 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.522 Malloc0 : 1.04 6031.94 23.56 0.00 0.00 21214.56 523.03 35691.91 00:09:32.522 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.522 Malloc1p0 : 1.04 6024.57 23.53 0.00 0.00 21208.76 756.18 34885.32 00:09:32.523 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 Malloc1p1 : 1.04 6017.25 23.50 0.00 0.00 21196.93 771.94 34280.37 00:09:32.523 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 Malloc2p0 : 1.04 6009.93 23.48 0.00 0.00 21183.34 765.64 33473.77 00:09:32.523 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 Malloc2p1 : 1.04 6002.65 23.45 0.00 0.00 21170.77 762.49 32667.18 00:09:32.523 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 Malloc2p2 : 1.05 5995.40 23.42 0.00 0.00 21161.00 756.18 32062.23 00:09:32.523 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 Malloc2p3 : 1.05 5988.13 23.39 0.00 0.00 21147.19 749.88 31255.63 00:09:32.523 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 Malloc2p4 : 1.05 5980.91 23.36 0.00 0.00 21132.85 753.03 30449.03 00:09:32.523 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 Malloc2p5 : 1.05 5973.70 23.33 0.00 0.00 21121.60 759.34 29844.09 00:09:32.523 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 Malloc2p6 : 1.05 5966.48 23.31 0.00 0.00 21108.65 753.03 29037.49 00:09:32.523 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 Malloc2p7 : 1.05 5959.31 23.28 0.00 0.00 21093.91 746.73 28230.89 00:09:32.523 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 TestPT : 1.05 5952.17 23.25 0.00 0.00 21079.50 787.69 27424.30 00:09:32.523 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 raid0 : 1.06 5943.96 23.22 0.00 0.00 21059.27 1424.15 26012.75 00:09:32.523 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 concat0 : 1.06 5935.92 23.19 0.00 0.00 21025.48 1405.24 24702.03 00:09:32.523 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 raid1 : 1.06 5925.89 23.15 0.00 0.00 20983.25 2192.94 22483.89 00:09:32.523 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.523 AIO0 : 1.06 5919.74 23.12 0.00 0.00 20913.96 806.60 21576.47 00:09:32.523 =================================================================================================================== 00:09:32.523 Total : 95627.95 373.55 0.00 0.00 21112.56 523.03 35691.91 00:09:32.523 00:09:32.523 real 0m1.866s 00:09:32.523 user 0m1.593s 00:09:32.523 sys 0m0.217s 00:09:32.523 10:04:54 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:32.523 10:04:54 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:09:32.523 ************************************ 00:09:32.523 END TEST bdev_write_zeroes 00:09:32.523 ************************************ 00:09:32.523 10:04:54 blockdev_general 
-- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:32.523 10:04:54 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:09:32.523 10:04:54 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:32.523 10:04:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:32.523 ************************************ 00:09:32.523 START TEST bdev_json_nonenclosed 00:09:32.523 ************************************ 00:09:32.523 10:04:54 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:32.523 [2024-06-10 10:04:54.347594] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:09:32.523 [2024-06-10 10:04:54.347637] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid946429 ] 00:09:32.783 [2024-06-10 10:04:54.433083] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:32.783 [2024-06-10 10:04:54.505937] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:32.783 [2024-06-10 10:04:54.505991] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:09:32.783 [2024-06-10 10:04:54.506005] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:32.783 [2024-06-10 10:04:54.506011] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:32.783 00:09:32.783 real 0m0.268s 00:09:32.783 user 0m0.170s 00:09:32.783 sys 0m0.096s 00:09:32.783 10:04:54 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:32.783 10:04:54 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:09:32.783 ************************************ 00:09:32.783 END TEST bdev_json_nonenclosed 00:09:32.783 ************************************ 00:09:32.783 10:04:54 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:32.783 10:04:54 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:09:32.783 10:04:54 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:32.783 10:04:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:32.783 ************************************ 00:09:32.783 START TEST bdev_json_nonarray 00:09:32.783 ************************************ 00:09:32.783 10:04:54 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:33.042 [2024-06-10 10:04:54.688193] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:09:33.043 [2024-06-10 10:04:54.688241] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid946623 ] 00:09:33.043 [2024-06-10 10:04:54.775910] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:33.043 [2024-06-10 10:04:54.851843] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:33.043 [2024-06-10 10:04:54.851903] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:09:33.043 [2024-06-10 10:04:54.851914] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:33.043 [2024-06-10 10:04:54.851920] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:33.302 00:09:33.302 real 0m0.273s 00:09:33.302 user 0m0.168s 00:09:33.302 sys 0m0.103s 00:09:33.303 10:04:54 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:33.303 10:04:54 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:09:33.303 ************************************ 00:09:33.303 END TEST bdev_json_nonarray 00:09:33.303 ************************************ 00:09:33.303 10:04:54 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:09:33.303 10:04:54 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:09:33.303 10:04:54 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:09:33.303 10:04:54 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:33.303 10:04:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:33.303 ************************************ 00:09:33.303 START TEST bdev_qos 00:09:33.303 ************************************ 00:09:33.303 10:04:54 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # qos_test_suite '' 00:09:33.303 10:04:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=946678 00:09:33.303 10:04:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 946678' 00:09:33.303 Process qos testing pid: 946678 00:09:33.303 10:04:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:09:33.303 10:04:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 946678 00:09:33.303 10:04:54 blockdev_general.bdev_qos -- common/autotest_common.sh@830 -- # '[' -z 946678 ']' 00:09:33.303 10:04:54 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:33.303 10:04:54 blockdev_general.bdev_qos -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:33.303 10:04:54 blockdev_general.bdev_qos -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:33.303 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:09:33.303 10:04:54 blockdev_general.bdev_qos -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:33.303 10:04:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:33.303 10:04:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:09:33.303 [2024-06-10 10:04:55.034454] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:09:33.303 [2024-06-10 10:04:55.034495] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid946678 ] 00:09:33.303 [2024-06-10 10:04:55.101709] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:33.303 [2024-06-10 10:04:55.163005] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@863 -- # return 0 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.243 Malloc_0 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_0 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local i 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.243 [ 00:09:34.243 { 00:09:34.243 "name": "Malloc_0", 00:09:34.243 "aliases": [ 00:09:34.243 "14b99f1e-3be3-4b0d-9a53-9fe0d862868d" 00:09:34.243 ], 00:09:34.243 "product_name": "Malloc disk", 00:09:34.243 "block_size": 512, 00:09:34.243 "num_blocks": 262144, 00:09:34.243 "uuid": "14b99f1e-3be3-4b0d-9a53-9fe0d862868d", 00:09:34.243 "assigned_rate_limits": { 00:09:34.243 "rw_ios_per_sec": 0, 00:09:34.243 "rw_mbytes_per_sec": 0, 00:09:34.243 "r_mbytes_per_sec": 0, 00:09:34.243 "w_mbytes_per_sec": 0 00:09:34.243 }, 00:09:34.243 "claimed": false, 00:09:34.243 "zoned": false, 
00:09:34.243 "supported_io_types": { 00:09:34.243 "read": true, 00:09:34.243 "write": true, 00:09:34.243 "unmap": true, 00:09:34.243 "write_zeroes": true, 00:09:34.243 "flush": true, 00:09:34.243 "reset": true, 00:09:34.243 "compare": false, 00:09:34.243 "compare_and_write": false, 00:09:34.243 "abort": true, 00:09:34.243 "nvme_admin": false, 00:09:34.243 "nvme_io": false 00:09:34.243 }, 00:09:34.243 "memory_domains": [ 00:09:34.243 { 00:09:34.243 "dma_device_id": "system", 00:09:34.243 "dma_device_type": 1 00:09:34.243 }, 00:09:34.243 { 00:09:34.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:34.243 "dma_device_type": 2 00:09:34.243 } 00:09:34.243 ], 00:09:34.243 "driver_specific": {} 00:09:34.243 } 00:09:34.243 ] 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # return 0 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.243 Null_1 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_name=Null_1 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local i 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:34.243 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.243 [ 00:09:34.243 { 00:09:34.243 "name": "Null_1", 00:09:34.243 "aliases": [ 00:09:34.243 "8b3b591d-6494-48f8-aa64-56359110483d" 00:09:34.243 ], 00:09:34.243 "product_name": "Null disk", 00:09:34.243 "block_size": 512, 00:09:34.243 "num_blocks": 262144, 00:09:34.243 "uuid": "8b3b591d-6494-48f8-aa64-56359110483d", 00:09:34.243 "assigned_rate_limits": { 00:09:34.243 "rw_ios_per_sec": 0, 00:09:34.243 "rw_mbytes_per_sec": 0, 00:09:34.243 "r_mbytes_per_sec": 0, 00:09:34.243 "w_mbytes_per_sec": 0 00:09:34.243 }, 00:09:34.243 "claimed": false, 00:09:34.243 "zoned": false, 00:09:34.243 "supported_io_types": { 00:09:34.243 "read": true, 00:09:34.243 "write": true, 00:09:34.243 "unmap": false, 00:09:34.243 "write_zeroes": true, 00:09:34.243 "flush": false, 00:09:34.244 "reset": true, 00:09:34.244 "compare": false, 00:09:34.244 "compare_and_write": false, 
00:09:34.244 "abort": true, 00:09:34.244 "nvme_admin": false, 00:09:34.244 "nvme_io": false 00:09:34.244 }, 00:09:34.244 "driver_specific": {} 00:09:34.244 } 00:09:34.244 ] 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # return 0 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:34.244 10:04:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:34.244 Running I/O for 60 seconds... 
00:09:39.524 10:05:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 75098.47 300393.88 0.00 0.00 303104.00 0.00 0.00 ' 00:09:39.524 10:05:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:39.524 10:05:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:39.524 10:05:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=75098.47 00:09:39.524 10:05:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 75098 00:09:39.524 10:05:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=75098 00:09:39.524 10:05:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=18000 00:09:39.524 10:05:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 18000 -gt 1000 ']' 00:09:39.525 10:05:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 18000 Malloc_0 00:09:39.525 10:05:01 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:39.525 10:05:01 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:39.525 10:05:01 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:39.525 10:05:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 18000 IOPS Malloc_0 00:09:39.525 10:05:01 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:09:39.525 10:05:01 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:39.525 10:05:01 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:39.525 ************************************ 00:09:39.525 START TEST bdev_qos_iops 00:09:39.525 ************************************ 00:09:39.525 10:05:01 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # run_qos_test 18000 IOPS Malloc_0 00:09:39.525 10:05:01 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=18000 00:09:39.525 10:05:01 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:39.525 10:05:01 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:09:39.525 10:05:01 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:39.525 10:05:01 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:39.525 10:05:01 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:39.525 10:05:01 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:39.525 10:05:01 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:39.525 10:05:01 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 18000.94 72003.75 0.00 0.00 73368.00 0.00 0.00 ' 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=18000.94 00:09:44.856 10:05:06 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 18000 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=18000 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=16200 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=19800 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 18000 -lt 16200 ']' 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 18000 -gt 19800 ']' 00:09:44.856 00:09:44.856 real 0m5.212s 00:09:44.856 user 0m0.100s 00:09:44.856 sys 0m0.037s 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:44.856 10:05:06 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:09:44.856 ************************************ 00:09:44.856 END TEST bdev_qos_iops 00:09:44.856 ************************************ 00:09:44.856 10:05:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:09:44.856 10:05:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:44.856 10:05:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:44.856 10:05:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:44.856 10:05:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:44.856 10:05:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:44.856 10:05:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 24841.64 99366.57 0.00 0.00 100352.00 0.00 0.00 ' 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=100352.00 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 100352 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=100352 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=9 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 9 -lt 2 ']' 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 9 Null_1 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 9 BANDWIDTH Null_1 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:09:50.138 10:05:11 
blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:50.138 10:05:11 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.138 ************************************ 00:09:50.138 START TEST bdev_qos_bw 00:09:50.138 ************************************ 00:09:50.138 10:05:11 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # run_qos_test 9 BANDWIDTH Null_1 00:09:50.138 10:05:11 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=9 00:09:50.138 10:05:11 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:50.138 10:05:11 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:09:50.138 10:05:11 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:50.138 10:05:11 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:50.138 10:05:11 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:50.138 10:05:11 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:50.138 10:05:11 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:50.138 10:05:11 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2304.13 9216.54 0.00 0.00 9380.00 0.00 0.00 ' 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=9380.00 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 9380 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=9380 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=9216 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=8294 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=10137 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 9380 -lt 8294 ']' 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 9380 -gt 10137 ']' 00:09:55.424 00:09:55.424 real 0m5.231s 00:09:55.424 user 0m0.108s 00:09:55.424 sys 0m0.031s 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:55.424 10:05:16 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:09:55.424 ************************************ 00:09:55.424 END TEST bdev_qos_bw 00:09:55.424 ************************************ 00:09:55.424 10:05:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:09:55.424 10:05:16 
blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:55.424 10:05:16 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.424 10:05:16 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:55.424 10:05:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:09:55.424 10:05:16 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:09:55.424 10:05:16 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:55.424 10:05:16 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.424 ************************************ 00:09:55.424 START TEST bdev_qos_ro_bw 00:09:55.424 ************************************ 00:09:55.424 10:05:17 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:09:55.424 10:05:17 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:09:55.424 10:05:17 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:55.424 10:05:17 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:09:55.424 10:05:17 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:55.424 10:05:17 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:55.424 10:05:17 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:55.424 10:05:17 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:55.424 10:05:17 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:55.424 10:05:17 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 512.12 2048.48 0.00 0.00 2060.00 0.00 0.00 ' 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 
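
Note: each run_qos_test pass above only succeeds if the measured rate lands within roughly ±10% of the configured limit; the 16200/19800, 8294/10137 and 1843/2252 bounds in the trace are exactly qos_limit*9/10 and qos_limit*11/10 in shell integer math. A small sketch of that window check (the function name is illustrative, not the verbatim helper):

    check_qos_window() {
        local qos_limit=$1 qos_result=$2
        local lower_limit=$((qos_limit * 9 / 10))
        local upper_limit=$((qos_limit * 11 / 10))
        # e.g. the 2 MB/s read-only limit is checked as 2048 -> window 1843..2252,
        # so the 2060 measured above passes
        if [ "$qos_result" -lt "$lower_limit" ] || [ "$qos_result" -gt "$upper_limit" ]; then
            echo "result $qos_result outside $lower_limit..$upper_limit for limit $qos_limit" >&2
            return 1
        fi
        return 0
    }
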
00:10:00.706
00:10:00.706 real 0m5.168s
00:10:00.706 user 0m0.109s
00:10:00.706 sys 0m0.031s
00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1125 -- # xtrace_disable
00:10:00.706 10:05:22 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x
00:10:00.706 ************************************
00:10:00.706 END TEST bdev_qos_ro_bw
00:10:00.706 ************************************
00:10:00.706 10:05:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0
00:10:00.706 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable
00:10:00.706 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:10:00.966 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:10:00.966 10:05:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1
00:10:00.966 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable
00:10:00.966 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:10:01.227
00:10:01.227 Latency(us)
00:10:01.227 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:01.227 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096)
00:10:01.227 Malloc_0 : 26.63 25107.67 98.08 0.00 0.00 10094.07 1701.42 503316.48
00:10:01.227 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096)
00:10:01.227 Null_1 : 26.76 24947.96 97.45 0.00 0.00 10231.07 699.47 133088.49
00:10:01.227 ===================================================================================================================
00:10:01.227 Total : 50055.63 195.53 0.00 0.00 10162.52 699.47 503316.48
00:10:01.227 0
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 946678
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@949 -- # '[' -z 946678 ']'
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # kill -0 946678
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # uname
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 946678
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # process_name=reactor_1
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']'
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # echo 'killing process with pid 946678'
00:10:01.227 killing process with pid 946678
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@968 -- # kill 946678
00:10:01.227 Received shutdown signal, test time was about 26.824846 seconds
00:10:01.227
00:10:01.227 Latency(us)
00:10:01.227 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:01.227 ===================================================================================================================
00:10:01.227 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:10:01.227 10:05:22 blockdev_general.bdev_qos -- common/autotest_common.sh@973 -- # wait 946678
00:10:01.227 10:05:23
blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:10:01.227 00:10:01.227 real 0m28.047s 00:10:01.227 user 0m28.795s 00:10:01.227 sys 0m0.596s 00:10:01.227 10:05:23 blockdev_general.bdev_qos -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:01.227 10:05:23 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:01.227 ************************************ 00:10:01.227 END TEST bdev_qos 00:10:01.227 ************************************ 00:10:01.227 10:05:23 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:10:01.228 10:05:23 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:10:01.228 10:05:23 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:01.228 10:05:23 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:01.489 ************************************ 00:10:01.489 START TEST bdev_qd_sampling 00:10:01.489 ************************************ 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # qd_sampling_test_suite '' 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=951413 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 951413' 00:10:01.489 Process bdev QD sampling period testing pid: 951413 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 951413 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@830 -- # '[' -z 951413 ']' 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:01.489 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:01.489 10:05:23 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:01.489 [2024-06-10 10:05:23.173643] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
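
Note: bdev_qd_sampling relaunches bdevperf with -z (start suspended, wait for RPC) and -C, and waitforlisten blocks until the application answers on /var/tmp/spdk.sock before any bdev_* RPCs are sent; the EAL parameter dump that follows is that application coming up. A rough sketch of the wait loop, assuming $rootdir, and treating the retry count and poll interval as arbitrary choices rather than the exact values used by autotest_common.sh:

    wait_for_rpc_sketch() {
        local pid=$1 rpc_sock=${2:-/var/tmp/spdk.sock}
        for _ in $(seq 1 100); do
            kill -0 "$pid" 2>/dev/null || return 1     # give up if the app died before listening
            if "$rootdir/scripts/rpc.py" -s "$rpc_sock" -t 1 rpc_get_methods &> /dev/null; then
                return 0                               # RPC server is reachable
            fi
            sleep 0.1
        done
        return 1
    }
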
00:10:01.489 [2024-06-10 10:05:23.173696] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid951413 ] 00:10:01.489 [2024-06-10 10:05:23.270008] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:01.750 [2024-06-10 10:05:23.366718] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:10:01.750 [2024-06-10 10:05:23.366725] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@863 -- # return 0 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:02.320 Malloc_QD 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_QD 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # local i 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:02.320 [ 00:10:02.320 { 00:10:02.320 "name": "Malloc_QD", 00:10:02.320 "aliases": [ 00:10:02.320 "68bfc0b2-b1f2-42cd-afd1-8a3cf2f97cf6" 00:10:02.320 ], 00:10:02.320 "product_name": "Malloc disk", 00:10:02.320 "block_size": 512, 00:10:02.320 "num_blocks": 262144, 00:10:02.320 "uuid": "68bfc0b2-b1f2-42cd-afd1-8a3cf2f97cf6", 00:10:02.320 "assigned_rate_limits": { 00:10:02.320 "rw_ios_per_sec": 0, 00:10:02.320 "rw_mbytes_per_sec": 0, 00:10:02.320 "r_mbytes_per_sec": 0, 00:10:02.320 "w_mbytes_per_sec": 0 00:10:02.320 }, 00:10:02.320 "claimed": false, 00:10:02.320 "zoned": false, 00:10:02.320 "supported_io_types": { 00:10:02.320 "read": true, 00:10:02.320 "write": true, 00:10:02.320 "unmap": true, 00:10:02.320 "write_zeroes": true, 00:10:02.320 "flush": true, 00:10:02.320 "reset": true, 00:10:02.320 "compare": false, 
00:10:02.320 "compare_and_write": false, 00:10:02.320 "abort": true, 00:10:02.320 "nvme_admin": false, 00:10:02.320 "nvme_io": false 00:10:02.320 }, 00:10:02.320 "memory_domains": [ 00:10:02.320 { 00:10:02.320 "dma_device_id": "system", 00:10:02.320 "dma_device_type": 1 00:10:02.320 }, 00:10:02.320 { 00:10:02.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:02.320 "dma_device_type": 2 00:10:02.320 } 00:10:02.320 ], 00:10:02.320 "driver_specific": {} 00:10:02.320 } 00:10:02.320 ] 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@906 -- # return 0 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:10:02.320 10:05:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:02.320 Running I/O for 5 seconds... 00:10:04.226 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:10:04.226 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:10:04.226 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:10:04.226 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:10:04.226 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:10:04.227 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:04.227 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:04.227 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:04.227 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:10:04.227 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:04.227 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:04.487 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:04.487 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:10:04.487 "tick_rate": 2600000000, 00:10:04.487 "ticks": 4715924344833373, 00:10:04.487 "bdevs": [ 00:10:04.487 { 00:10:04.487 "name": "Malloc_QD", 00:10:04.487 "bytes_read": 838906368, 00:10:04.487 "num_read_ops": 204804, 00:10:04.487 "bytes_written": 0, 00:10:04.487 "num_write_ops": 0, 00:10:04.487 "bytes_unmapped": 0, 00:10:04.487 "num_unmap_ops": 0, 00:10:04.487 "bytes_copied": 0, 00:10:04.487 "num_copy_ops": 0, 00:10:04.487 "read_latency_ticks": 2565002049568, 00:10:04.487 "max_read_latency_ticks": 18688536, 00:10:04.487 "min_read_latency_ticks": 248126, 00:10:04.487 "write_latency_ticks": 0, 00:10:04.487 "max_write_latency_ticks": 0, 00:10:04.487 "min_write_latency_ticks": 0, 00:10:04.487 "unmap_latency_ticks": 0, 00:10:04.487 "max_unmap_latency_ticks": 0, 00:10:04.487 "min_unmap_latency_ticks": 0, 00:10:04.487 "copy_latency_ticks": 0, 00:10:04.487 "max_copy_latency_ticks": 0, 00:10:04.487 "min_copy_latency_ticks": 0, 00:10:04.487 "io_error": {}, 00:10:04.487 "queue_depth_polling_period": 10, 00:10:04.487 "queue_depth": 512, 00:10:04.487 "io_time": 30, 00:10:04.487 "weighted_io_time": 
15360
00:10:04.487 }
00:10:04.487 ]
00:10:04.487 }'
00:10:04.487 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period'
00:10:04.487 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10
00:10:04.487 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']'
00:10:04.487 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']'
00:10:04.487 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD
00:10:04.487 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable
00:10:04.487 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x
00:10:04.487
00:10:04.487 Latency(us)
00:10:04.487 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:04.487 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096)
00:10:04.487 Malloc_QD : 2.00 52920.56 206.72 0.00 0.00 4826.60 926.33 7208.96
00:10:04.487 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096)
00:10:04.487 Malloc_QD : 2.00 53404.85 208.61 0.00 0.00 4783.08 768.79 7208.96
00:10:04.487 ===================================================================================================================
00:10:04.487 Total : 106325.41 415.33 0.00 0.00 4804.74 768.79 7208.96
00:10:04.488 0
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 951413
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@949 -- # '[' -z 951413 ']'
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # kill -0 951413
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # uname
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 951413
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # echo 'killing process with pid 951413'
00:10:04.488 killing process with pid 951413
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@968 -- # kill 951413
00:10:04.488 Received shutdown signal, test time was about 2.072949 seconds
00:10:04.488
00:10:04.488 Latency(us)
00:10:04.488 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:04.488 ===================================================================================================================
00:10:04.488 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@973 -- # wait 951413
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT
00:10:04.488
00:10:04.488 real 0m3.240s
00:10:04.488 user 0m6.405s
00:10:04.488 sys 0m0.338s
00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling --
common/autotest_common.sh@1125 -- # xtrace_disable 00:10:04.488 10:05:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:04.488 ************************************ 00:10:04.488 END TEST bdev_qd_sampling 00:10:04.488 ************************************ 00:10:04.748 10:05:26 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:10:04.748 10:05:26 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:10:04.748 10:05:26 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:04.748 10:05:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:04.748 ************************************ 00:10:04.748 START TEST bdev_error 00:10:04.748 ************************************ 00:10:04.748 10:05:26 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # error_test_suite '' 00:10:04.748 10:05:26 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:10:04.748 10:05:26 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:10:04.748 10:05:26 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:10:04.748 10:05:26 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=952014 00:10:04.748 10:05:26 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 952014' 00:10:04.748 Process error testing pid: 952014 00:10:04.748 10:05:26 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 952014 00:10:04.748 10:05:26 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:10:04.748 10:05:26 blockdev_general.bdev_error -- common/autotest_common.sh@830 -- # '[' -z 952014 ']' 00:10:04.748 10:05:26 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:04.748 10:05:26 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:04.748 10:05:26 blockdev_general.bdev_error -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:04.748 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:04.748 10:05:26 blockdev_general.bdev_error -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:04.748 10:05:26 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:04.748 [2024-06-10 10:05:26.488507] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
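
Note: the bdev_error suite starting here builds a small error-injection stack before driving I/O: a malloc bdev Dev_1 wrapped by an error bdev (exposed as EE_Dev_1), plus an untouched Dev_2, then five forced failures on EE_Dev_1. The same RPCs appear via rpc_cmd in the trace below; issued directly against scripts/rpc.py they would look roughly like this (default RPC socket assumed):

    rpc="$rootdir/scripts/rpc.py"
    $rpc bdev_malloc_create -b Dev_1 128 512                 # 128 MiB backing device, 512-byte blocks
    $rpc bdev_error_create Dev_1                             # error bdev on top of Dev_1, exposed as EE_Dev_1
    $rpc bdev_malloc_create -b Dev_2 128 512                 # second device with no injection
    $rpc bdev_error_inject_error EE_Dev_1 all failure -n 5   # fail the next 5 I/Os of any type
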
00:10:04.748 [2024-06-10 10:05:26.488558] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid952014 ] 00:10:04.748 [2024-06-10 10:05:26.558698] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:05.008 [2024-06-10 10:05:26.624663] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@863 -- # return 0 00:10:05.579 10:05:27 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:05.579 Dev_1 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_1 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:05.579 [ 00:10:05.579 { 00:10:05.579 "name": "Dev_1", 00:10:05.579 "aliases": [ 00:10:05.579 "081ff208-e4cd-4895-89f5-a65dc497aea1" 00:10:05.579 ], 00:10:05.579 "product_name": "Malloc disk", 00:10:05.579 "block_size": 512, 00:10:05.579 "num_blocks": 262144, 00:10:05.579 "uuid": "081ff208-e4cd-4895-89f5-a65dc497aea1", 00:10:05.579 "assigned_rate_limits": { 00:10:05.579 "rw_ios_per_sec": 0, 00:10:05.579 "rw_mbytes_per_sec": 0, 00:10:05.579 "r_mbytes_per_sec": 0, 00:10:05.579 "w_mbytes_per_sec": 0 00:10:05.579 }, 00:10:05.579 "claimed": false, 00:10:05.579 "zoned": false, 00:10:05.579 "supported_io_types": { 00:10:05.579 "read": true, 00:10:05.579 "write": true, 00:10:05.579 "unmap": true, 00:10:05.579 "write_zeroes": true, 00:10:05.579 "flush": true, 00:10:05.579 "reset": true, 00:10:05.579 "compare": false, 00:10:05.579 "compare_and_write": false, 00:10:05.579 "abort": true, 00:10:05.579 "nvme_admin": false, 00:10:05.579 "nvme_io": false 00:10:05.579 }, 00:10:05.579 "memory_domains": [ 00:10:05.579 { 00:10:05.579 "dma_device_id": "system", 00:10:05.579 
"dma_device_type": 1 00:10:05.579 }, 00:10:05.579 { 00:10:05.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:05.579 "dma_device_type": 2 00:10:05.579 } 00:10:05.579 ], 00:10:05.579 "driver_specific": {} 00:10:05.579 } 00:10:05.579 ] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:10:05.579 10:05:27 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:05.579 true 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:05.579 Dev_2 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_2 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:05.579 [ 00:10:05.579 { 00:10:05.579 "name": "Dev_2", 00:10:05.579 "aliases": [ 00:10:05.579 "3c895e4a-416e-449c-87ed-f9d15c9cab60" 00:10:05.579 ], 00:10:05.579 "product_name": "Malloc disk", 00:10:05.579 "block_size": 512, 00:10:05.579 "num_blocks": 262144, 00:10:05.579 "uuid": "3c895e4a-416e-449c-87ed-f9d15c9cab60", 00:10:05.579 "assigned_rate_limits": { 00:10:05.579 "rw_ios_per_sec": 0, 00:10:05.579 "rw_mbytes_per_sec": 0, 00:10:05.579 "r_mbytes_per_sec": 0, 00:10:05.579 "w_mbytes_per_sec": 0 00:10:05.579 }, 00:10:05.579 "claimed": false, 00:10:05.579 "zoned": false, 00:10:05.579 "supported_io_types": { 00:10:05.579 "read": true, 00:10:05.579 "write": true, 00:10:05.579 "unmap": true, 00:10:05.579 "write_zeroes": true, 00:10:05.579 "flush": true, 00:10:05.579 "reset": true, 00:10:05.579 "compare": false, 00:10:05.579 "compare_and_write": false, 00:10:05.579 "abort": true, 
00:10:05.579 "nvme_admin": false, 00:10:05.579 "nvme_io": false 00:10:05.579 }, 00:10:05.579 "memory_domains": [ 00:10:05.579 { 00:10:05.579 "dma_device_id": "system", 00:10:05.579 "dma_device_type": 1 00:10:05.579 }, 00:10:05.579 { 00:10:05.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:05.579 "dma_device_type": 2 00:10:05.579 } 00:10:05.579 ], 00:10:05.579 "driver_specific": {} 00:10:05.579 } 00:10:05.579 ] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:10:05.579 10:05:27 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:05.579 10:05:27 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:05.579 10:05:27 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:10:05.579 10:05:27 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:05.840 Running I/O for 5 seconds... 00:10:06.780 10:05:28 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 952014 00:10:06.780 10:05:28 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 952014' 00:10:06.780 Process is existed as continue on error is set. Pid: 952014 00:10:06.780 10:05:28 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:10:06.780 10:05:28 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:06.780 10:05:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.780 10:05:28 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:06.780 10:05:28 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:10:06.780 10:05:28 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:06.780 10:05:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.780 10:05:28 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:06.780 10:05:28 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:10:06.780 Timeout while waiting for response: 00:10:06.780 00:10:06.780 00:10:10.979 00:10:10.979 Latency(us) 00:10:10.979 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:10.979 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:10.979 EE_Dev_1 : 0.92 45478.58 177.65 5.45 0.00 348.84 111.06 557.69 00:10:10.979 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:10.979 Dev_2 : 5.00 99062.07 386.96 0.00 0.00 158.63 55.14 10687.41 00:10:10.979 =================================================================================================================== 00:10:10.979 Total : 144540.65 564.61 5.45 0.00 173.40 55.14 10687.41 00:10:11.919 10:05:33 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 952014 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@949 -- # '[' -z 952014 ']' 00:10:11.919 10:05:33 
blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # kill -0 952014 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # uname 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 952014 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # echo 'killing process with pid 952014' 00:10:11.919 killing process with pid 952014 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@968 -- # kill 952014 00:10:11.919 Received shutdown signal, test time was about 5.000000 seconds 00:10:11.919 00:10:11.919 Latency(us) 00:10:11.919 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:11.919 =================================================================================================================== 00:10:11.919 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@973 -- # wait 952014 00:10:11.919 10:05:33 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=953251 00:10:11.919 10:05:33 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 953251' 00:10:11.919 Process error testing pid: 953251 00:10:11.919 10:05:33 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:10:11.919 10:05:33 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 953251 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@830 -- # '[' -z 953251 ']' 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:11.919 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:11.919 10:05:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:11.919 [2024-06-10 10:05:33.719360] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
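
Note: unlike the first run, this second bdevperf instance is started without the continue-on-error argument, so perform_tests is expected to fail and the harness wraps the wait in NOT. The exit-status handling traced further down (es=255 -> es=127 -> es=1) belongs to that helper; a loose sketch of the inversion it implements (simplified, not the verbatim autotest_common.sh code):

    NOT_sketch() {
        if "$@"; then
            return 1    # the wrapped command unexpectedly succeeded
        fi
        return 0        # it failed, which is what this test wants
    }
    # usage: NOT_sketch wait "$ERR_PID"
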
00:10:11.919 [2024-06-10 10:05:33.719417] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid953251 ] 00:10:12.179 [2024-06-10 10:05:33.787565] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:12.179 [2024-06-10 10:05:33.851021] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@863 -- # return 0 00:10:12.749 10:05:34 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:12.749 Dev_1 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:12.749 10:05:34 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_1 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:12.749 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.010 [ 00:10:13.010 { 00:10:13.010 "name": "Dev_1", 00:10:13.010 "aliases": [ 00:10:13.010 "40a21c41-2c63-4bd4-b7de-1799dee408c9" 00:10:13.010 ], 00:10:13.010 "product_name": "Malloc disk", 00:10:13.010 "block_size": 512, 00:10:13.010 "num_blocks": 262144, 00:10:13.010 "uuid": "40a21c41-2c63-4bd4-b7de-1799dee408c9", 00:10:13.010 "assigned_rate_limits": { 00:10:13.010 "rw_ios_per_sec": 0, 00:10:13.010 "rw_mbytes_per_sec": 0, 00:10:13.010 "r_mbytes_per_sec": 0, 00:10:13.010 "w_mbytes_per_sec": 0 00:10:13.010 }, 00:10:13.010 "claimed": false, 00:10:13.010 "zoned": false, 00:10:13.010 "supported_io_types": { 00:10:13.010 "read": true, 00:10:13.010 "write": true, 00:10:13.010 "unmap": true, 00:10:13.010 "write_zeroes": true, 00:10:13.010 "flush": true, 00:10:13.010 "reset": true, 00:10:13.010 "compare": false, 00:10:13.010 "compare_and_write": false, 00:10:13.010 "abort": true, 00:10:13.010 "nvme_admin": false, 00:10:13.010 "nvme_io": false 00:10:13.010 }, 00:10:13.010 "memory_domains": [ 00:10:13.010 { 00:10:13.010 "dma_device_id": "system", 00:10:13.010 
"dma_device_type": 1 00:10:13.010 }, 00:10:13.010 { 00:10:13.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:13.010 "dma_device_type": 2 00:10:13.010 } 00:10:13.010 ], 00:10:13.010 "driver_specific": {} 00:10:13.010 } 00:10:13.010 ] 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:10:13.010 10:05:34 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.010 true 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:13.010 10:05:34 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.010 Dev_2 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:13.010 10:05:34 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_2 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.010 [ 00:10:13.010 { 00:10:13.010 "name": "Dev_2", 00:10:13.010 "aliases": [ 00:10:13.010 "3491cd99-1401-46c0-8155-4f121d500b16" 00:10:13.010 ], 00:10:13.010 "product_name": "Malloc disk", 00:10:13.010 "block_size": 512, 00:10:13.010 "num_blocks": 262144, 00:10:13.010 "uuid": "3491cd99-1401-46c0-8155-4f121d500b16", 00:10:13.010 "assigned_rate_limits": { 00:10:13.010 "rw_ios_per_sec": 0, 00:10:13.010 "rw_mbytes_per_sec": 0, 00:10:13.010 "r_mbytes_per_sec": 0, 00:10:13.010 "w_mbytes_per_sec": 0 00:10:13.010 }, 00:10:13.010 "claimed": false, 00:10:13.010 "zoned": false, 00:10:13.010 "supported_io_types": { 00:10:13.010 "read": true, 00:10:13.010 "write": true, 00:10:13.010 "unmap": true, 00:10:13.010 "write_zeroes": true, 00:10:13.010 "flush": true, 00:10:13.010 "reset": true, 00:10:13.010 "compare": false, 00:10:13.010 "compare_and_write": false, 00:10:13.010 "abort": true, 
00:10:13.010 "nvme_admin": false, 00:10:13.010 "nvme_io": false 00:10:13.010 }, 00:10:13.010 "memory_domains": [ 00:10:13.010 { 00:10:13.010 "dma_device_id": "system", 00:10:13.010 "dma_device_type": 1 00:10:13.010 }, 00:10:13.010 { 00:10:13.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:13.010 "dma_device_type": 2 00:10:13.010 } 00:10:13.010 ], 00:10:13.010 "driver_specific": {} 00:10:13.010 } 00:10:13.010 ] 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:10:13.010 10:05:34 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:13.010 10:05:34 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 953251 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@649 -- # local es=0 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # valid_exec_arg wait 953251 00:10:13.010 10:05:34 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@637 -- # local arg=wait 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # type -t wait 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:13.010 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # wait 953251 00:10:13.010 Running I/O for 5 seconds... 
00:10:13.010 task offset: 112080 on job bdev=EE_Dev_1 fails 00:10:13.010 00:10:13.010 Latency(us) 00:10:13.010 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:13.010 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:13.010 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:13.010 EE_Dev_1 : 0.00 36184.21 141.34 8223.68 0.00 299.63 110.28 532.48 00:10:13.010 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:13.010 Dev_2 : 0.00 21753.91 84.98 0.00 0.00 550.45 107.13 1027.15 00:10:13.010 =================================================================================================================== 00:10:13.010 Total : 57938.12 226.32 8223.68 0.00 435.67 107.13 1027.15 00:10:13.010 [2024-06-10 10:05:34.811691] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:13.010 request: 00:10:13.010 { 00:10:13.010 "method": "perform_tests", 00:10:13.010 "req_id": 1 00:10:13.010 } 00:10:13.010 Got JSON-RPC error response 00:10:13.010 response: 00:10:13.010 { 00:10:13.010 "code": -32603, 00:10:13.010 "message": "bdevperf failed with error Operation not permitted" 00:10:13.010 } 00:10:13.271 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # es=255 00:10:13.271 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:10:13.271 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # es=127 00:10:13.271 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@662 -- # case "$es" in 00:10:13.271 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@669 -- # es=1 00:10:13.271 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:10:13.271 00:10:13.271 real 0m8.542s 00:10:13.271 user 0m9.035s 00:10:13.271 sys 0m0.549s 00:10:13.271 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:13.271 10:05:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.271 ************************************ 00:10:13.271 END TEST bdev_error 00:10:13.271 ************************************ 00:10:13.271 10:05:35 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:13.271 10:05:35 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:10:13.271 10:05:35 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:13.271 10:05:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:13.271 ************************************ 00:10:13.271 START TEST bdev_stat 00:10:13.271 ************************************ 00:10:13.271 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # stat_test_suite '' 00:10:13.271 10:05:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:13.271 10:05:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=953558 00:10:13.271 10:05:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 953558' 00:10:13.271 Process Bdev IO statistics testing pid: 953558 00:10:13.271 10:05:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:13.271 10:05:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 
-C '' 00:10:13.271 10:05:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 953558 00:10:13.271 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@830 -- # '[' -z 953558 ']' 00:10:13.271 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:13.271 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:13.271 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:13.271 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:13.272 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:13.272 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:13.272 [2024-06-10 10:05:35.096867] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:10:13.272 [2024-06-10 10:05:35.096914] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid953558 ] 00:10:13.532 [2024-06-10 10:05:35.184338] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:13.532 [2024-06-10 10:05:35.276871] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:10:13.532 [2024-06-10 10:05:35.276952] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@863 -- # return 0 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:14.103 Malloc_STAT 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_STAT 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # local i 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:14.103 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:14.103 
10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:14.364 [ 00:10:14.364 { 00:10:14.364 "name": "Malloc_STAT", 00:10:14.364 "aliases": [ 00:10:14.364 "5a342369-307a-4e83-a312-b3efe6d9588c" 00:10:14.364 ], 00:10:14.364 "product_name": "Malloc disk", 00:10:14.364 "block_size": 512, 00:10:14.364 "num_blocks": 262144, 00:10:14.364 "uuid": "5a342369-307a-4e83-a312-b3efe6d9588c", 00:10:14.364 "assigned_rate_limits": { 00:10:14.364 "rw_ios_per_sec": 0, 00:10:14.364 "rw_mbytes_per_sec": 0, 00:10:14.364 "r_mbytes_per_sec": 0, 00:10:14.364 "w_mbytes_per_sec": 0 00:10:14.364 }, 00:10:14.364 "claimed": false, 00:10:14.364 "zoned": false, 00:10:14.364 "supported_io_types": { 00:10:14.364 "read": true, 00:10:14.364 "write": true, 00:10:14.364 "unmap": true, 00:10:14.364 "write_zeroes": true, 00:10:14.364 "flush": true, 00:10:14.364 "reset": true, 00:10:14.364 "compare": false, 00:10:14.364 "compare_and_write": false, 00:10:14.364 "abort": true, 00:10:14.364 "nvme_admin": false, 00:10:14.364 "nvme_io": false 00:10:14.364 }, 00:10:14.364 "memory_domains": [ 00:10:14.364 { 00:10:14.364 "dma_device_id": "system", 00:10:14.364 "dma_device_type": 1 00:10:14.364 }, 00:10:14.364 { 00:10:14.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:14.364 "dma_device_type": 2 00:10:14.364 } 00:10:14.364 ], 00:10:14.364 "driver_specific": {} 00:10:14.364 } 00:10:14.364 ] 00:10:14.364 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:14.364 10:05:35 blockdev_general.bdev_stat -- common/autotest_common.sh@906 -- # return 0 00:10:14.364 10:05:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:14.364 10:05:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:14.364 Running I/O for 10 seconds... 
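While that ten-second run is in flight, the stat test snapshots the whole-bdev read counter, then the per-channel counters, then the whole-bdev counter again, and requires the per-channel sum to land between the two snapshots. Condensed into a hedged sketch (rpc.py on the default socket; variable names are illustrative):

  RPC="$SPDK_DIR/scripts/rpc.py"

  io_count1=$($RPC bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
  per_channel=$($RPC bdev_get_iostat -b Malloc_STAT -c)
  ch1=$(echo "$per_channel" | jq -r '.channels[0].num_read_ops')
  ch2=$(echo "$per_channel" | jq -r '.channels[1].num_read_ops')
  io_count2=$($RPC bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')

  # I/O keeps flowing between snapshots, so the per-channel total must sit between
  # the first and second whole-bdev read counts (243972 and 267524 in the run below).
  sum=$((ch1 + ch2))
  [ "$sum" -lt "$io_count1" ] && exit 1
  [ "$sum" -gt "$io_count2" ] && exit 1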
00:10:16.276 10:05:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:16.276 10:05:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:16.276 10:05:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:16.276 10:05:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:16.276 10:05:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:16.276 10:05:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:16.276 10:05:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:16.276 10:05:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:16.276 10:05:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:16.276 10:05:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:16.276 10:05:37 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:16.276 10:05:37 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:16.276 "tick_rate": 2600000000, 00:10:16.276 "ticks": 4715955254903751, 00:10:16.276 "bdevs": [ 00:10:16.276 { 00:10:16.276 "name": "Malloc_STAT", 00:10:16.276 "bytes_read": 999338496, 00:10:16.276 "num_read_ops": 243972, 00:10:16.276 "bytes_written": 0, 00:10:16.276 "num_write_ops": 0, 00:10:16.276 "bytes_unmapped": 0, 00:10:16.276 "num_unmap_ops": 0, 00:10:16.276 "bytes_copied": 0, 00:10:16.276 "num_copy_ops": 0, 00:10:16.276 "read_latency_ticks": 2549536995256, 00:10:16.276 "max_read_latency_ticks": 14524050, 00:10:16.276 "min_read_latency_ticks": 221856, 00:10:16.276 "write_latency_ticks": 0, 00:10:16.276 "max_write_latency_ticks": 0, 00:10:16.276 "min_write_latency_ticks": 0, 00:10:16.276 "unmap_latency_ticks": 0, 00:10:16.276 "max_unmap_latency_ticks": 0, 00:10:16.276 "min_unmap_latency_ticks": 0, 00:10:16.276 "copy_latency_ticks": 0, 00:10:16.276 "max_copy_latency_ticks": 0, 00:10:16.276 "min_copy_latency_ticks": 0, 00:10:16.276 "io_error": {} 00:10:16.276 } 00:10:16.276 ] 00:10:16.276 }' 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=243972 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:16.276 "tick_rate": 2600000000, 00:10:16.276 "ticks": 4715955436513411, 00:10:16.276 "name": "Malloc_STAT", 00:10:16.276 "channels": [ 00:10:16.276 { 00:10:16.276 "thread_id": 2, 00:10:16.276 "bytes_read": 514850816, 00:10:16.276 "num_read_ops": 125696, 00:10:16.276 "bytes_written": 0, 00:10:16.276 "num_write_ops": 0, 00:10:16.276 "bytes_unmapped": 0, 00:10:16.276 "num_unmap_ops": 0, 
00:10:16.276 "bytes_copied": 0, 00:10:16.276 "num_copy_ops": 0, 00:10:16.276 "read_latency_ticks": 1320142771088, 00:10:16.276 "max_read_latency_ticks": 14524050, 00:10:16.276 "min_read_latency_ticks": 6437144, 00:10:16.276 "write_latency_ticks": 0, 00:10:16.276 "max_write_latency_ticks": 0, 00:10:16.276 "min_write_latency_ticks": 0, 00:10:16.276 "unmap_latency_ticks": 0, 00:10:16.276 "max_unmap_latency_ticks": 0, 00:10:16.276 "min_unmap_latency_ticks": 0, 00:10:16.276 "copy_latency_ticks": 0, 00:10:16.276 "max_copy_latency_ticks": 0, 00:10:16.276 "min_copy_latency_ticks": 0 00:10:16.276 }, 00:10:16.276 { 00:10:16.276 "thread_id": 3, 00:10:16.276 "bytes_read": 520093696, 00:10:16.276 "num_read_ops": 126976, 00:10:16.276 "bytes_written": 0, 00:10:16.276 "num_write_ops": 0, 00:10:16.276 "bytes_unmapped": 0, 00:10:16.276 "num_unmap_ops": 0, 00:10:16.276 "bytes_copied": 0, 00:10:16.276 "num_copy_ops": 0, 00:10:16.276 "read_latency_ticks": 1321570684264, 00:10:16.276 "max_read_latency_ticks": 14498758, 00:10:16.276 "min_read_latency_ticks": 6409494, 00:10:16.276 "write_latency_ticks": 0, 00:10:16.276 "max_write_latency_ticks": 0, 00:10:16.276 "min_write_latency_ticks": 0, 00:10:16.276 "unmap_latency_ticks": 0, 00:10:16.276 "max_unmap_latency_ticks": 0, 00:10:16.276 "min_unmap_latency_ticks": 0, 00:10:16.276 "copy_latency_ticks": 0, 00:10:16.276 "max_copy_latency_ticks": 0, 00:10:16.276 "min_copy_latency_ticks": 0 00:10:16.276 } 00:10:16.276 ] 00:10:16.276 }' 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=125696 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=125696 00:10:16.276 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:10:16.536 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=126976 00:10:16.536 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=252672 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:10:16.537 "tick_rate": 2600000000, 00:10:16.537 "ticks": 4715955717538083, 00:10:16.537 "bdevs": [ 00:10:16.537 { 00:10:16.537 "name": "Malloc_STAT", 00:10:16.537 "bytes_read": 1095807488, 00:10:16.537 "num_read_ops": 267524, 00:10:16.537 "bytes_written": 0, 00:10:16.537 "num_write_ops": 0, 00:10:16.537 "bytes_unmapped": 0, 00:10:16.537 "num_unmap_ops": 0, 00:10:16.537 "bytes_copied": 0, 00:10:16.537 "num_copy_ops": 0, 00:10:16.537 "read_latency_ticks": 2784722684790, 00:10:16.537 "max_read_latency_ticks": 14524050, 00:10:16.537 "min_read_latency_ticks": 221856, 00:10:16.537 "write_latency_ticks": 0, 00:10:16.537 "max_write_latency_ticks": 0, 00:10:16.537 "min_write_latency_ticks": 0, 00:10:16.537 "unmap_latency_ticks": 0, 00:10:16.537 "max_unmap_latency_ticks": 0, 00:10:16.537 "min_unmap_latency_ticks": 0, 00:10:16.537 "copy_latency_ticks": 0, 00:10:16.537 "max_copy_latency_ticks": 0, 00:10:16.537 
"min_copy_latency_ticks": 0, 00:10:16.537 "io_error": {} 00:10:16.537 } 00:10:16.537 ] 00:10:16.537 }' 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=267524 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 252672 -lt 243972 ']' 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 252672 -gt 267524 ']' 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:16.537 00:10:16.537 Latency(us) 00:10:16.537 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:16.537 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:16.537 Malloc_STAT : 2.17 63633.52 248.57 0.00 0.00 4015.47 907.42 5595.77 00:10:16.537 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:16.537 Malloc_STAT : 2.17 64277.92 251.09 0.00 0.00 3975.21 790.84 5595.77 00:10:16.537 =================================================================================================================== 00:10:16.537 Total : 127911.44 499.65 0.00 0.00 3995.23 790.84 5595.77 00:10:16.537 0 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 953558 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@949 -- # '[' -z 953558 ']' 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # kill -0 953558 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # uname 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 953558 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 953558' 00:10:16.537 killing process with pid 953558 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@968 -- # kill 953558 00:10:16.537 Received shutdown signal, test time was about 2.244083 seconds 00:10:16.537 00:10:16.537 Latency(us) 00:10:16.537 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:16.537 =================================================================================================================== 00:10:16.537 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:16.537 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@973 -- # wait 953558 00:10:16.863 10:05:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:10:16.863 00:10:16.863 real 0m3.404s 00:10:16.863 user 0m6.902s 00:10:16.863 sys 0m0.340s 00:10:16.863 10:05:38 blockdev_general.bdev_stat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:16.863 10:05:38 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:16.863 ************************************ 00:10:16.863 END TEST bdev_stat 00:10:16.863 ************************************ 00:10:16.863 10:05:38 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:10:16.863 10:05:38 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:10:16.863 10:05:38 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:10:16.863 10:05:38 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:10:16.863 10:05:38 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:16.863 10:05:38 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:16.863 10:05:38 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:16.863 10:05:38 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:16.863 10:05:38 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:16.863 10:05:38 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:16.863 00:10:16.863 real 1m46.439s 00:10:16.863 user 7m14.349s 00:10:16.863 sys 0m15.109s 00:10:16.863 10:05:38 blockdev_general -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:16.863 10:05:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:16.863 ************************************ 00:10:16.863 END TEST blockdev_general 00:10:16.863 ************************************ 00:10:16.863 10:05:38 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:16.863 10:05:38 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:10:16.863 10:05:38 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:16.863 10:05:38 -- common/autotest_common.sh@10 -- # set +x 00:10:16.863 ************************************ 00:10:16.863 START TEST bdev_raid 00:10:16.863 ************************************ 00:10:16.863 10:05:38 bdev_raid -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:16.863 * Looking for test storage... 
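The real/user/sys timings and the START TEST / END TEST banners above all come from the run_test wrapper in test/common/autotest_common.sh. A rough stand-in that only mirrors the banner-and-timing behaviour visible in this log (the real helper also handles xtrace bookkeeping and argument checks not shown here):

  run_test_sketch() {
          local test_name=$1; shift
          echo "************************************"
          echo "START TEST $test_name"
          echo "************************************"
          time "$@"                  # produces the real/user/sys lines seen above
          local rc=$?
          echo "************************************"
          echo "END TEST $test_name"
          echo "************************************"
          return $rc
  }

  run_test_sketch bdev_stat stat_test_suite ''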
00:10:16.863 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:16.863 10:05:38 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:16.863 10:05:38 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:16.863 10:05:38 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:16.863 10:05:38 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:10:16.863 10:05:38 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:10:16.863 10:05:38 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:10:16.863 10:05:38 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:10:16.863 10:05:38 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:10:16.863 10:05:38 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:10:16.863 10:05:38 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:10:16.863 10:05:38 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:10:16.863 10:05:38 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:10:16.863 10:05:38 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:10:16.863 10:05:38 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:16.863 10:05:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:17.123 ************************************ 00:10:17.123 START TEST raid_function_test_raid0 00:10:17.123 ************************************ 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # raid_function_test raid0 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=954261 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 954261' 00:10:17.123 Process raid pid: 954261 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 954261 /var/tmp/spdk-raid.sock 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@830 -- # '[' -z 954261 ']' 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:17.123 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
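Once the bdev_svc process above is listening on /var/tmp/spdk-raid.sock, configure_raid_bdev pipes a small rpcs.txt into rpc.py (bdev_raid.sh@69 through @74). The file contents are not echoed into this log, so the following is a reconstruction under assumptions: the 32 MiB base size and 64 KiB strip size are guesses, and only the resulting raid of 131072 blocks x 512 B is actually confirmed by the log.

  rpc() { "$SPDK_DIR/scripts/rpc.py" -s /var/tmp/spdk-raid.sock "$@"; }

  rpc bdev_malloc_create -b Base_1 32 512
  rpc bdev_malloc_create -b Base_2 32 512
  rpc bdev_raid_create -n raid -z 64 -r raid0 -b "Base_1 Base_2"

  # The test then looks up the online raid bdev and exports it over NBD,
  # matching bdev_raid.sh@90 and nbd_common.sh@15 below.
  rpc bdev_raid_get_bdevs online | jq -r '.[0]["name"] | select(.)'
  rpc nbd_start_disk raid /dev/nbd0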
00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:17.123 10:05:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:17.123 [2024-06-10 10:05:38.791572] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:10:17.123 [2024-06-10 10:05:38.791618] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:17.123 [2024-06-10 10:05:38.862157] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:17.123 [2024-06-10 10:05:38.952055] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:17.383 [2024-06-10 10:05:39.006004] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:17.384 [2024-06-10 10:05:39.006025] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:17.955 10:05:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:17.955 10:05:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@863 -- # return 0 00:10:17.955 10:05:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:10:17.955 10:05:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:10:17.955 10:05:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:17.955 10:05:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:10:17.955 10:05:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:18.215 [2024-06-10 10:05:39.864623] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:18.215 [2024-06-10 10:05:39.865955] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:18.215 [2024-06-10 10:05:39.866014] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x800d40 00:10:18.215 [2024-06-10 10:05:39.866021] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:18.215 [2024-06-10 10:05:39.866187] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8178e0 00:10:18.215 [2024-06-10 10:05:39.866292] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x800d40 00:10:18.215 [2024-06-10 10:05:39.866298] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x800d40 00:10:18.215 [2024-06-10 10:05:39.866385] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:18.215 Base_1 00:10:18.215 Base_2 00:10:18.215 10:05:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:18.215 10:05:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:18.215 10:05:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:18.476 10:05:40 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:18.476 [2024-06-10 10:05:40.289730] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9ac850 00:10:18.476 /dev/nbd0 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # local i 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@872 -- # break 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:18.476 1+0 records in 00:10:18.476 1+0 records out 00:10:18.476 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000541124 s, 7.6 MB/s 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # size=4096 00:10:18.476 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:18.736 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:10:18.736 10:05:40 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@888 -- # return 0 00:10:18.736 10:05:40 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:18.736 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:18.737 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:18.737 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:18.737 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:18.737 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:18.737 { 00:10:18.737 "nbd_device": "/dev/nbd0", 00:10:18.737 "bdev_name": "raid" 00:10:18.737 } 00:10:18.737 ]' 00:10:18.737 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:18.737 { 00:10:18.737 "nbd_device": "/dev/nbd0", 00:10:18.737 "bdev_name": "raid" 00:10:18.737 } 00:10:18.737 ]' 00:10:18.737 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:18.997 
10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:18.997 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:18.997 4096+0 records in 00:10:18.997 4096+0 records out 00:10:18.998 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0274 s, 76.5 MB/s 00:10:18.998 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:18.998 4096+0 records in 00:10:18.998 4096+0 records out 00:10:18.998 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.181963 s, 11.5 MB/s 00:10:18.998 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:18.998 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:18.998 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:18.998 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:18.998 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:18.998 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:18.998 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:18.998 128+0 records in 00:10:18.998 128+0 records out 00:10:18.998 65536 bytes (66 kB, 64 KiB) copied, 0.000362202 s, 181 MB/s 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:19.259 2035+0 records in 00:10:19.259 2035+0 records out 00:10:19.259 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00339034 s, 307 MB/s 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:19.259 456+0 records in 00:10:19.259 456+0 records out 00:10:19.259 233472 bytes (233 kB, 228 KiB) copied, 0.0011305 s, 207 MB/s 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:19.259 10:05:40 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:19.259 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:19.259 [2024-06-10 10:05:41.119451] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:19.259 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:19.259 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:19.259 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:19.259 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:19.259 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 954261 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@949 -- # '[' -z 954261 ']' 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # kill -0 954261 00:10:19.519 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # uname 00:10:19.520 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:19.780 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 954261 00:10:19.780 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:19.780 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:19.780 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # echo 'killing process with pid 954261' 00:10:19.780 killing process with pid 954261 00:10:19.780 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@968 -- # kill 954261 00:10:19.780 [2024-06-10 10:05:41.432655] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:19.780 [2024-06-10 10:05:41.432704] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:19.780 [2024-06-10 10:05:41.432733] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:19.780 [2024-06-10 10:05:41.432738] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x800d40 name raid, state offline 00:10:19.780 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@973 -- # wait 954261 00:10:19.780 [2024-06-10 10:05:41.444829] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:19.780 10:05:41 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:10:19.780 00:10:19.780 real 0m2.837s 00:10:19.780 user 0m3.952s 00:10:19.780 sys 0m0.823s 00:10:19.780 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:19.780 10:05:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:19.780 ************************************ 00:10:19.780 END TEST raid_function_test_raid0 00:10:19.780 ************************************ 00:10:19.780 10:05:41 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:10:19.780 10:05:41 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:10:19.780 10:05:41 bdev_raid -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:10:19.780 10:05:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:19.780 ************************************ 00:10:19.780 START TEST raid_function_test_concat 00:10:19.780 ************************************ 00:10:19.780 10:05:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # raid_function_test concat 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=954951 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 954951' 00:10:20.041 Process raid pid: 954951 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 954951 /var/tmp/spdk-raid.sock 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@830 -- # '[' -z 954951 ']' 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:20.041 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:20.041 10:05:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:20.041 [2024-06-10 10:05:41.679756] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
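The concat case starting here drives the same raid_unmap_data_verify pass that the raid0 run above just finished: fill the raid through /dev/nbd0 with a random pattern, compare, then for each (offset, length) pair zero that range in the reference file, blkdiscard it on the NBD device, and compare again. Condensed, with the offsets, lengths and paths taken from the log (a sketch of bdev_raid.sh@18 through @54, not the script itself):

  nbd=/dev/nbd0
  blksize=$(lsblk -o LOG-SEC "$nbd" | grep -v LOG-SEC | cut -d ' ' -f 5)

  dd if=/dev/urandom of=/raidtest/raidrandtest bs=$blksize count=4096
  dd if=/raidtest/raidrandtest of=$nbd bs=$blksize count=4096 oflag=direct
  blockdev --flushbufs $nbd
  cmp -b -n $((4096 * blksize)) /raidtest/raidrandtest $nbd

  offs=(0 1028 321); nums=(128 2035 456)
  for i in 0 1 2; do
          # zero the same range in the reference file that blkdiscard unmaps on the raid
          dd if=/dev/zero of=/raidtest/raidrandtest bs=$blksize seek=${offs[i]} count=${nums[i]} conv=notrunc
          blkdiscard -o $(( offs[i] * blksize )) -l $(( nums[i] * blksize )) $nbd
          blockdev --flushbufs $nbd
          cmp -b -n $((4096 * blksize)) /raidtest/raidrandtest $nbd
  done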
00:10:20.041 [2024-06-10 10:05:41.679800] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:20.041 [2024-06-10 10:05:41.764151] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:20.041 [2024-06-10 10:05:41.838634] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:20.041 [2024-06-10 10:05:41.888141] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:20.041 [2024-06-10 10:05:41.888168] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:20.984 10:05:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:20.984 10:05:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@863 -- # return 0 00:10:20.984 10:05:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:10:20.984 10:05:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:10:20.984 10:05:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:20.984 10:05:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:10:20.984 10:05:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:20.984 [2024-06-10 10:05:42.768268] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:20.984 [2024-06-10 10:05:42.769614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:20.984 [2024-06-10 10:05:42.769675] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21a6d40 00:10:20.984 [2024-06-10 10:05:42.769682] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:20.984 [2024-06-10 10:05:42.769864] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21bd7c0 00:10:20.984 [2024-06-10 10:05:42.769971] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21a6d40 00:10:20.984 [2024-06-10 10:05:42.769977] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x21a6d40 00:10:20.984 [2024-06-10 10:05:42.770064] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:20.984 Base_1 00:10:20.984 Base_2 00:10:20.984 10:05:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:20.984 10:05:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:20.984 10:05:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:21.245 10:05:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:21.506 [2024-06-10 10:05:43.121190] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a7d40 00:10:21.506 /dev/nbd0 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # local i 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@872 -- # break 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:21.506 1+0 records in 00:10:21.506 1+0 records out 00:10:21.506 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237906 s, 17.2 MB/s 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # size=4096 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@888 -- # return 0 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:21.506 
10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:21.506 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:21.767 { 00:10:21.767 "nbd_device": "/dev/nbd0", 00:10:21.767 "bdev_name": "raid" 00:10:21.767 } 00:10:21.767 ]' 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:21.767 { 00:10:21.767 "nbd_device": "/dev/nbd0", 00:10:21.767 "bdev_name": "raid" 00:10:21.767 } 00:10:21.767 ]' 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd 
if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:21.767 4096+0 records in 00:10:21.767 4096+0 records out 00:10:21.767 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0277887 s, 75.5 MB/s 00:10:21.767 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:22.028 4096+0 records in 00:10:22.028 4096+0 records out 00:10:22.028 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.179392 s, 11.7 MB/s 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:22.028 128+0 records in 00:10:22.028 128+0 records out 00:10:22.028 65536 bytes (66 kB, 64 KiB) copied, 0.000378897 s, 173 MB/s 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:22.028 2035+0 records in 00:10:22.028 2035+0 records out 00:10:22.028 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0047545 s, 219 MB/s 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:22.028 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:22.029 456+0 records in 00:10:22.029 456+0 
records out 00:10:22.029 233472 bytes (233 kB, 228 KiB) copied, 0.00113547 s, 206 MB/s 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:22.029 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:22.289 [2024-06-10 10:05:43.943289] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:22.289 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:22.289 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:22.289 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:22.289 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:22.289 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:22.289 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:22.289 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:10:22.289 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:10:22.290 10:05:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:22.290 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:22.290 10:05:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # echo '' 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 954951 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@949 -- # '[' -z 954951 ']' 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # kill -0 954951 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # uname 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 954951 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 954951' 00:10:22.550 killing process with pid 954951 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@968 -- # kill 954951 00:10:22.550 [2024-06-10 10:05:44.279679] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:22.550 [2024-06-10 10:05:44.279738] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:22.550 [2024-06-10 10:05:44.279772] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:22.550 [2024-06-10 10:05:44.279779] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21a6d40 name raid, state offline 00:10:22.550 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@973 -- # wait 954951 00:10:22.550 [2024-06-10 10:05:44.295707] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:22.810 10:05:44 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:10:22.810 00:10:22.810 real 0m2.796s 00:10:22.810 user 0m3.838s 00:10:22.810 sys 0m0.834s 00:10:22.810 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:22.810 10:05:44 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:22.810 ************************************ 00:10:22.810 END TEST raid_function_test_concat 00:10:22.810 ************************************ 00:10:22.810 10:05:44 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:10:22.810 10:05:44 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:10:22.810 10:05:44 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:22.810 10:05:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:22.810 ************************************ 00:10:22.810 START TEST 
raid0_resize_test 00:10:22.810 ************************************ 00:10:22.810 10:05:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # raid0_resize_test 00:10:22.810 10:05:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:10:22.810 10:05:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:10:22.810 10:05:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:10:22.810 10:05:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:10:22.810 10:05:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:10:22.810 10:05:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:10:22.810 10:05:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=955368 00:10:22.810 10:05:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 955368' 00:10:22.810 Process raid pid: 955368 00:10:22.810 10:05:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 955368 /var/tmp/spdk-raid.sock 00:10:22.811 10:05:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:22.811 10:05:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@830 -- # '[' -z 955368 ']' 00:10:22.811 10:05:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:22.811 10:05:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:22.811 10:05:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:22.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:22.811 10:05:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:22.811 10:05:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:22.811 [2024-06-10 10:05:44.576406] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
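For readers following the raid0_resize_test trace that starts here: the test drives a standalone bdev_svc app over a private RPC socket, builds a two-member raid0 out of null bdevs, grows each base bdev, and checks that the raid's block count doubles. A minimal sketch of that sequence, run from an SPDK checkout (the socket-wait loop below is a simplified stand-in for the harness's waitforlisten helper, not the real thing):

    sock=/var/tmp/spdk-raid.sock
    ./test/app/bdev_svc/bdev_svc -r "$sock" -i 0 -L bdev_raid &
    svc_pid=$!
    while [ ! -S "$sock" ]; do sleep 0.1; done     # simplified stand-in for waitforlisten
    ./scripts/rpc.py -s "$sock" bdev_null_create Base_1 32 512
    ./scripts/rpc.py -s "$sock" bdev_null_create Base_2 32 512
    ./scripts/rpc.py -s "$sock" bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid
    ./scripts/rpc.py -s "$sock" bdev_null_resize Base_1 64
    # still 131072 blocks (131072 * 512 B = 64 MiB) until both base bdevs have grown
    ./scripts/rpc.py -s "$sock" bdev_get_bdevs -b Raid | jq '.[].num_blocks'
    ./scripts/rpc.py -s "$sock" bdev_null_resize Base_2 64
    # now 262144 blocks * 512 B = 128 MiB
    ./scripts/rpc.py -s "$sock" bdev_get_bdevs -b Raid | jq '.[].num_blocks'
    kill "$svc_pid"

The trace below follows the same shape; the 131072 -> 262144 block-count change is the pass condition it asserts.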
00:10:22.811 [2024-06-10 10:05:44.576463] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:22.811 [2024-06-10 10:05:44.668727] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:23.070 [2024-06-10 10:05:44.738077] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:23.070 [2024-06-10 10:05:44.785414] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:23.070 [2024-06-10 10:05:44.785439] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:23.640 10:05:45 bdev_raid.raid0_resize_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:23.640 10:05:45 bdev_raid.raid0_resize_test -- common/autotest_common.sh@863 -- # return 0 00:10:23.640 10:05:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:10:23.900 Base_1 00:10:23.900 10:05:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:10:23.900 Base_2 00:10:24.160 10:05:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:10:24.160 [2024-06-10 10:05:45.950002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:24.160 [2024-06-10 10:05:45.951147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:24.160 [2024-06-10 10:05:45.951181] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x179dcc0 00:10:24.160 [2024-06-10 10:05:45.951186] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:24.160 [2024-06-10 10:05:45.951342] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19516a0 00:10:24.160 [2024-06-10 10:05:45.951413] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x179dcc0 00:10:24.160 [2024-06-10 10:05:45.951418] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x179dcc0 00:10:24.160 [2024-06-10 10:05:45.951490] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:24.160 10:05:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:10:24.421 [2024-06-10 10:05:46.122418] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:24.421 [2024-06-10 10:05:46.122428] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:10:24.421 true 00:10:24.421 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:24.421 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:10:24.682 [2024-06-10 10:05:46.310990] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:24.682 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # 
blkcnt=131072 00:10:24.682 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:10:24.682 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:10:24.682 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:10:24.682 [2024-06-10 10:05:46.487318] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:24.682 [2024-06-10 10:05:46.487327] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:10:24.682 [2024-06-10 10:05:46.487340] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:10:24.682 true 00:10:24.682 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:24.682 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:10:24.942 [2024-06-10 10:05:46.679915] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 955368 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@949 -- # '[' -z 955368 ']' 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # kill -0 955368 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # uname 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 955368 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 955368' 00:10:24.942 killing process with pid 955368 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@968 -- # kill 955368 00:10:24.942 [2024-06-10 10:05:46.750479] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:24.942 [2024-06-10 10:05:46.750516] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:24.942 [2024-06-10 10:05:46.750547] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:24.942 [2024-06-10 10:05:46.750553] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x179dcc0 name Raid, state offline 00:10:24.942 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@973 -- # wait 955368 00:10:24.942 [2024-06-10 10:05:46.751439] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:25.202 10:05:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:10:25.202 00:10:25.202 real 0m2.340s 
00:10:25.202 user 0m3.650s 00:10:25.202 sys 0m0.413s 00:10:25.202 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:25.202 10:05:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:25.202 ************************************ 00:10:25.202 END TEST raid0_resize_test 00:10:25.202 ************************************ 00:10:25.202 10:05:46 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:10:25.202 10:05:46 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:25.202 10:05:46 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:10:25.202 10:05:46 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:25.202 10:05:46 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:25.202 10:05:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:25.202 ************************************ 00:10:25.202 START TEST raid_state_function_test 00:10:25.202 ************************************ 00:10:25.202 10:05:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 2 false 00:10:25.202 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:25.202 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:25.202 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:25.202 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:25.202 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:25.202 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:25.202 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:25.202 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:25.202 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:25.202 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:25.203 10:05:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=955987 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 955987' 00:10:25.203 Process raid pid: 955987 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 955987 /var/tmp/spdk-raid.sock 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 955987 ']' 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:25.203 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:25.203 10:05:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:25.203 [2024-06-10 10:05:46.995645] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
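The raid_state_function_test trace that begins here exercises the raid bdev state machine: creating the raid before its base bdevs exist leaves it "configuring", adding both malloc base bdevs brings it "online", and removing one again takes a raid0 "offline". A condensed sketch of the RPC sequence, assuming a bdev_svc is already listening on the same private socket (as started just above) and using the names from this test:

    sock=/var/tmp/spdk-raid.sock
    raid_state() {
        ./scripts/rpc.py -s "$sock" bdev_raid_get_bdevs all \
            | jq -r '.[] | select(.name == "Existed_Raid") | .state'
    }
    ./scripts/rpc.py -s "$sock" bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    raid_state                                   # configuring: no base bdev exists yet
    ./scripts/rpc.py -s "$sock" bdev_malloc_create 32 512 -b BaseBdev1
    raid_state                                   # still configuring: 1 of 2 bases discovered
    ./scripts/rpc.py -s "$sock" bdev_malloc_create 32 512 -b BaseBdev2
    raid_state                                   # online: both bases claimed
    ./scripts/rpc.py -s "$sock" bdev_malloc_delete BaseBdev1
    raid_state                                   # offline: raid0 cannot run degraded

The actual test additionally deletes and recreates Existed_Raid between steps and compares the full JSON (strip_size_kb, num_base_bdevs_discovered, base_bdevs_list) rather than just the state field; the sketch only shows the core state transitions.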
00:10:25.203 [2024-06-10 10:05:46.995689] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:25.463 [2024-06-10 10:05:47.083449] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:25.463 [2024-06-10 10:05:47.145684] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:25.463 [2024-06-10 10:05:47.185609] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:25.463 [2024-06-10 10:05:47.185631] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:26.034 10:05:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:26.034 10:05:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:10:26.034 10:05:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:26.294 [2024-06-10 10:05:47.996456] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:26.294 [2024-06-10 10:05:47.996487] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:26.294 [2024-06-10 10:05:47.996493] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:26.294 [2024-06-10 10:05:47.996499] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:26.294 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:26.554 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:26.554 "name": "Existed_Raid", 00:10:26.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:26.554 "strip_size_kb": 64, 00:10:26.554 "state": "configuring", 00:10:26.554 "raid_level": "raid0", 00:10:26.554 "superblock": false, 00:10:26.554 "num_base_bdevs": 2, 
00:10:26.554 "num_base_bdevs_discovered": 0, 00:10:26.554 "num_base_bdevs_operational": 2, 00:10:26.554 "base_bdevs_list": [ 00:10:26.554 { 00:10:26.554 "name": "BaseBdev1", 00:10:26.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:26.554 "is_configured": false, 00:10:26.554 "data_offset": 0, 00:10:26.554 "data_size": 0 00:10:26.554 }, 00:10:26.554 { 00:10:26.554 "name": "BaseBdev2", 00:10:26.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:26.554 "is_configured": false, 00:10:26.554 "data_offset": 0, 00:10:26.554 "data_size": 0 00:10:26.554 } 00:10:26.554 ] 00:10:26.554 }' 00:10:26.554 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:26.554 10:05:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:27.125 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:27.125 [2024-06-10 10:05:48.914677] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:27.125 [2024-06-10 10:05:48.914694] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd6ab00 name Existed_Raid, state configuring 00:10:27.125 10:05:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:27.385 [2024-06-10 10:05:49.107177] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:27.385 [2024-06-10 10:05:49.107194] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:27.385 [2024-06-10 10:05:49.107200] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:27.385 [2024-06-10 10:05:49.107210] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:27.385 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:27.646 [2024-06-10 10:05:49.298296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:27.646 BaseBdev1 00:10:27.646 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:27.646 10:05:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:10:27.646 10:05:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:27.646 10:05:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:10:27.646 10:05:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:27.646 10:05:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:27.646 10:05:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:27.646 10:05:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:27.908 [ 00:10:27.908 { 00:10:27.908 
"name": "BaseBdev1", 00:10:27.908 "aliases": [ 00:10:27.908 "7be4f497-7c19-4e56-b201-b6a4ceb88f92" 00:10:27.908 ], 00:10:27.908 "product_name": "Malloc disk", 00:10:27.908 "block_size": 512, 00:10:27.908 "num_blocks": 65536, 00:10:27.908 "uuid": "7be4f497-7c19-4e56-b201-b6a4ceb88f92", 00:10:27.908 "assigned_rate_limits": { 00:10:27.908 "rw_ios_per_sec": 0, 00:10:27.908 "rw_mbytes_per_sec": 0, 00:10:27.908 "r_mbytes_per_sec": 0, 00:10:27.908 "w_mbytes_per_sec": 0 00:10:27.908 }, 00:10:27.908 "claimed": true, 00:10:27.908 "claim_type": "exclusive_write", 00:10:27.908 "zoned": false, 00:10:27.908 "supported_io_types": { 00:10:27.908 "read": true, 00:10:27.908 "write": true, 00:10:27.908 "unmap": true, 00:10:27.908 "write_zeroes": true, 00:10:27.908 "flush": true, 00:10:27.908 "reset": true, 00:10:27.908 "compare": false, 00:10:27.908 "compare_and_write": false, 00:10:27.908 "abort": true, 00:10:27.908 "nvme_admin": false, 00:10:27.908 "nvme_io": false 00:10:27.908 }, 00:10:27.908 "memory_domains": [ 00:10:27.908 { 00:10:27.908 "dma_device_id": "system", 00:10:27.908 "dma_device_type": 1 00:10:27.908 }, 00:10:27.908 { 00:10:27.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:27.908 "dma_device_type": 2 00:10:27.908 } 00:10:27.908 ], 00:10:27.908 "driver_specific": {} 00:10:27.908 } 00:10:27.908 ] 00:10:27.908 10:05:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:10:27.908 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:27.908 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:27.908 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:27.908 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:27.908 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:27.909 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:27.909 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:27.909 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:27.909 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:27.909 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:27.909 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:27.909 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:28.169 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:28.169 "name": "Existed_Raid", 00:10:28.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:28.169 "strip_size_kb": 64, 00:10:28.169 "state": "configuring", 00:10:28.169 "raid_level": "raid0", 00:10:28.169 "superblock": false, 00:10:28.169 "num_base_bdevs": 2, 00:10:28.169 "num_base_bdevs_discovered": 1, 00:10:28.169 "num_base_bdevs_operational": 2, 00:10:28.169 "base_bdevs_list": [ 00:10:28.169 { 00:10:28.169 "name": "BaseBdev1", 00:10:28.169 "uuid": "7be4f497-7c19-4e56-b201-b6a4ceb88f92", 00:10:28.169 
"is_configured": true, 00:10:28.169 "data_offset": 0, 00:10:28.169 "data_size": 65536 00:10:28.169 }, 00:10:28.169 { 00:10:28.169 "name": "BaseBdev2", 00:10:28.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:28.169 "is_configured": false, 00:10:28.169 "data_offset": 0, 00:10:28.169 "data_size": 0 00:10:28.169 } 00:10:28.169 ] 00:10:28.169 }' 00:10:28.169 10:05:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:28.169 10:05:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:28.738 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:28.738 [2024-06-10 10:05:50.533413] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:28.738 [2024-06-10 10:05:50.533444] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd6a3f0 name Existed_Raid, state configuring 00:10:28.738 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:28.998 [2024-06-10 10:05:50.725928] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:28.998 [2024-06-10 10:05:50.727069] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:28.998 [2024-06-10 10:05:50.727093] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:28.998 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:29.258 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:29.258 "name": "Existed_Raid", 00:10:29.258 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:10:29.258 "strip_size_kb": 64, 00:10:29.258 "state": "configuring", 00:10:29.258 "raid_level": "raid0", 00:10:29.258 "superblock": false, 00:10:29.258 "num_base_bdevs": 2, 00:10:29.258 "num_base_bdevs_discovered": 1, 00:10:29.258 "num_base_bdevs_operational": 2, 00:10:29.258 "base_bdevs_list": [ 00:10:29.258 { 00:10:29.258 "name": "BaseBdev1", 00:10:29.258 "uuid": "7be4f497-7c19-4e56-b201-b6a4ceb88f92", 00:10:29.258 "is_configured": true, 00:10:29.258 "data_offset": 0, 00:10:29.258 "data_size": 65536 00:10:29.258 }, 00:10:29.258 { 00:10:29.258 "name": "BaseBdev2", 00:10:29.258 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:29.258 "is_configured": false, 00:10:29.258 "data_offset": 0, 00:10:29.258 "data_size": 0 00:10:29.258 } 00:10:29.258 ] 00:10:29.258 }' 00:10:29.258 10:05:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:29.258 10:05:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:29.828 10:05:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:29.828 [2024-06-10 10:05:51.645075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:29.828 [2024-06-10 10:05:51.645096] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd6b1c0 00:10:29.828 [2024-06-10 10:05:51.645101] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:29.828 [2024-06-10 10:05:51.645244] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf1e220 00:10:29.828 [2024-06-10 10:05:51.645334] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd6b1c0 00:10:29.828 [2024-06-10 10:05:51.645340] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd6b1c0 00:10:29.828 [2024-06-10 10:05:51.645460] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:29.828 BaseBdev2 00:10:29.828 10:05:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:29.828 10:05:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:10:29.828 10:05:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:29.828 10:05:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:10:29.828 10:05:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:29.828 10:05:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:29.828 10:05:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:30.089 10:05:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:30.350 [ 00:10:30.350 { 00:10:30.350 "name": "BaseBdev2", 00:10:30.350 "aliases": [ 00:10:30.350 "684abb57-272a-41c6-b047-3c0aa166e137" 00:10:30.350 ], 00:10:30.350 "product_name": "Malloc disk", 00:10:30.350 "block_size": 512, 00:10:30.350 "num_blocks": 65536, 00:10:30.350 "uuid": 
"684abb57-272a-41c6-b047-3c0aa166e137", 00:10:30.350 "assigned_rate_limits": { 00:10:30.350 "rw_ios_per_sec": 0, 00:10:30.350 "rw_mbytes_per_sec": 0, 00:10:30.350 "r_mbytes_per_sec": 0, 00:10:30.350 "w_mbytes_per_sec": 0 00:10:30.350 }, 00:10:30.350 "claimed": true, 00:10:30.350 "claim_type": "exclusive_write", 00:10:30.350 "zoned": false, 00:10:30.350 "supported_io_types": { 00:10:30.350 "read": true, 00:10:30.350 "write": true, 00:10:30.350 "unmap": true, 00:10:30.350 "write_zeroes": true, 00:10:30.350 "flush": true, 00:10:30.350 "reset": true, 00:10:30.350 "compare": false, 00:10:30.350 "compare_and_write": false, 00:10:30.350 "abort": true, 00:10:30.350 "nvme_admin": false, 00:10:30.350 "nvme_io": false 00:10:30.350 }, 00:10:30.350 "memory_domains": [ 00:10:30.350 { 00:10:30.350 "dma_device_id": "system", 00:10:30.350 "dma_device_type": 1 00:10:30.350 }, 00:10:30.350 { 00:10:30.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:30.350 "dma_device_type": 2 00:10:30.350 } 00:10:30.350 ], 00:10:30.350 "driver_specific": {} 00:10:30.350 } 00:10:30.350 ] 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:30.350 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:30.610 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:30.610 "name": "Existed_Raid", 00:10:30.610 "uuid": "e5eef6b5-1e1e-41a5-a063-7ac388be6beb", 00:10:30.610 "strip_size_kb": 64, 00:10:30.610 "state": "online", 00:10:30.610 "raid_level": "raid0", 00:10:30.610 "superblock": false, 00:10:30.610 "num_base_bdevs": 2, 00:10:30.610 "num_base_bdevs_discovered": 2, 00:10:30.610 "num_base_bdevs_operational": 2, 00:10:30.610 "base_bdevs_list": [ 00:10:30.610 { 00:10:30.610 "name": "BaseBdev1", 00:10:30.610 "uuid": "7be4f497-7c19-4e56-b201-b6a4ceb88f92", 00:10:30.610 "is_configured": true, 00:10:30.610 "data_offset": 0, 00:10:30.610 
"data_size": 65536 00:10:30.610 }, 00:10:30.610 { 00:10:30.610 "name": "BaseBdev2", 00:10:30.610 "uuid": "684abb57-272a-41c6-b047-3c0aa166e137", 00:10:30.610 "is_configured": true, 00:10:30.610 "data_offset": 0, 00:10:30.610 "data_size": 65536 00:10:30.610 } 00:10:30.610 ] 00:10:30.610 }' 00:10:30.610 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:30.610 10:05:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:31.181 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:31.181 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:31.181 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:31.181 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:31.181 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:31.181 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:31.181 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:31.181 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:31.181 [2024-06-10 10:05:52.892420] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:31.181 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:31.181 "name": "Existed_Raid", 00:10:31.181 "aliases": [ 00:10:31.181 "e5eef6b5-1e1e-41a5-a063-7ac388be6beb" 00:10:31.181 ], 00:10:31.181 "product_name": "Raid Volume", 00:10:31.181 "block_size": 512, 00:10:31.181 "num_blocks": 131072, 00:10:31.181 "uuid": "e5eef6b5-1e1e-41a5-a063-7ac388be6beb", 00:10:31.181 "assigned_rate_limits": { 00:10:31.181 "rw_ios_per_sec": 0, 00:10:31.181 "rw_mbytes_per_sec": 0, 00:10:31.181 "r_mbytes_per_sec": 0, 00:10:31.181 "w_mbytes_per_sec": 0 00:10:31.181 }, 00:10:31.181 "claimed": false, 00:10:31.181 "zoned": false, 00:10:31.181 "supported_io_types": { 00:10:31.181 "read": true, 00:10:31.181 "write": true, 00:10:31.181 "unmap": true, 00:10:31.181 "write_zeroes": true, 00:10:31.181 "flush": true, 00:10:31.181 "reset": true, 00:10:31.181 "compare": false, 00:10:31.181 "compare_and_write": false, 00:10:31.181 "abort": false, 00:10:31.181 "nvme_admin": false, 00:10:31.181 "nvme_io": false 00:10:31.181 }, 00:10:31.181 "memory_domains": [ 00:10:31.181 { 00:10:31.181 "dma_device_id": "system", 00:10:31.181 "dma_device_type": 1 00:10:31.181 }, 00:10:31.181 { 00:10:31.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.181 "dma_device_type": 2 00:10:31.181 }, 00:10:31.181 { 00:10:31.181 "dma_device_id": "system", 00:10:31.181 "dma_device_type": 1 00:10:31.181 }, 00:10:31.181 { 00:10:31.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.181 "dma_device_type": 2 00:10:31.181 } 00:10:31.181 ], 00:10:31.181 "driver_specific": { 00:10:31.181 "raid": { 00:10:31.181 "uuid": "e5eef6b5-1e1e-41a5-a063-7ac388be6beb", 00:10:31.181 "strip_size_kb": 64, 00:10:31.181 "state": "online", 00:10:31.181 "raid_level": "raid0", 00:10:31.181 "superblock": false, 00:10:31.181 "num_base_bdevs": 2, 00:10:31.181 "num_base_bdevs_discovered": 2, 00:10:31.181 "num_base_bdevs_operational": 2, 00:10:31.181 
"base_bdevs_list": [ 00:10:31.181 { 00:10:31.181 "name": "BaseBdev1", 00:10:31.181 "uuid": "7be4f497-7c19-4e56-b201-b6a4ceb88f92", 00:10:31.182 "is_configured": true, 00:10:31.182 "data_offset": 0, 00:10:31.182 "data_size": 65536 00:10:31.182 }, 00:10:31.182 { 00:10:31.182 "name": "BaseBdev2", 00:10:31.182 "uuid": "684abb57-272a-41c6-b047-3c0aa166e137", 00:10:31.182 "is_configured": true, 00:10:31.182 "data_offset": 0, 00:10:31.182 "data_size": 65536 00:10:31.182 } 00:10:31.182 ] 00:10:31.182 } 00:10:31.182 } 00:10:31.182 }' 00:10:31.182 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:31.182 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:31.182 BaseBdev2' 00:10:31.182 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:31.182 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:31.182 10:05:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:31.442 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:31.442 "name": "BaseBdev1", 00:10:31.442 "aliases": [ 00:10:31.442 "7be4f497-7c19-4e56-b201-b6a4ceb88f92" 00:10:31.442 ], 00:10:31.442 "product_name": "Malloc disk", 00:10:31.442 "block_size": 512, 00:10:31.442 "num_blocks": 65536, 00:10:31.442 "uuid": "7be4f497-7c19-4e56-b201-b6a4ceb88f92", 00:10:31.442 "assigned_rate_limits": { 00:10:31.442 "rw_ios_per_sec": 0, 00:10:31.442 "rw_mbytes_per_sec": 0, 00:10:31.442 "r_mbytes_per_sec": 0, 00:10:31.442 "w_mbytes_per_sec": 0 00:10:31.442 }, 00:10:31.442 "claimed": true, 00:10:31.442 "claim_type": "exclusive_write", 00:10:31.442 "zoned": false, 00:10:31.442 "supported_io_types": { 00:10:31.442 "read": true, 00:10:31.442 "write": true, 00:10:31.442 "unmap": true, 00:10:31.442 "write_zeroes": true, 00:10:31.443 "flush": true, 00:10:31.443 "reset": true, 00:10:31.443 "compare": false, 00:10:31.443 "compare_and_write": false, 00:10:31.443 "abort": true, 00:10:31.443 "nvme_admin": false, 00:10:31.443 "nvme_io": false 00:10:31.443 }, 00:10:31.443 "memory_domains": [ 00:10:31.443 { 00:10:31.443 "dma_device_id": "system", 00:10:31.443 "dma_device_type": 1 00:10:31.443 }, 00:10:31.443 { 00:10:31.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.443 "dma_device_type": 2 00:10:31.443 } 00:10:31.443 ], 00:10:31.443 "driver_specific": {} 00:10:31.443 }' 00:10:31.443 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:31.443 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:31.443 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:31.443 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:31.443 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:31.443 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:31.443 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:31.704 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:31.704 10:05:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:31.704 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:31.704 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:31.704 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:31.704 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:31.704 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:31.704 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:31.964 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:31.964 "name": "BaseBdev2", 00:10:31.964 "aliases": [ 00:10:31.964 "684abb57-272a-41c6-b047-3c0aa166e137" 00:10:31.964 ], 00:10:31.964 "product_name": "Malloc disk", 00:10:31.964 "block_size": 512, 00:10:31.964 "num_blocks": 65536, 00:10:31.964 "uuid": "684abb57-272a-41c6-b047-3c0aa166e137", 00:10:31.964 "assigned_rate_limits": { 00:10:31.964 "rw_ios_per_sec": 0, 00:10:31.964 "rw_mbytes_per_sec": 0, 00:10:31.964 "r_mbytes_per_sec": 0, 00:10:31.964 "w_mbytes_per_sec": 0 00:10:31.964 }, 00:10:31.964 "claimed": true, 00:10:31.964 "claim_type": "exclusive_write", 00:10:31.964 "zoned": false, 00:10:31.964 "supported_io_types": { 00:10:31.964 "read": true, 00:10:31.964 "write": true, 00:10:31.964 "unmap": true, 00:10:31.964 "write_zeroes": true, 00:10:31.964 "flush": true, 00:10:31.964 "reset": true, 00:10:31.964 "compare": false, 00:10:31.964 "compare_and_write": false, 00:10:31.964 "abort": true, 00:10:31.964 "nvme_admin": false, 00:10:31.964 "nvme_io": false 00:10:31.964 }, 00:10:31.964 "memory_domains": [ 00:10:31.964 { 00:10:31.964 "dma_device_id": "system", 00:10:31.964 "dma_device_type": 1 00:10:31.964 }, 00:10:31.964 { 00:10:31.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.964 "dma_device_type": 2 00:10:31.964 } 00:10:31.964 ], 00:10:31.964 "driver_specific": {} 00:10:31.964 }' 00:10:31.964 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:31.964 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:31.964 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:31.964 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:31.964 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:32.225 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:32.225 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:32.225 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:32.225 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:32.225 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:32.225 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:32.225 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:32.225 10:05:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:32.485 [2024-06-10 10:05:54.171509] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:32.485 [2024-06-10 10:05:54.171528] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:32.485 [2024-06-10 10:05:54.171558] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:32.485 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:32.485 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:32.485 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:32.485 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:32.485 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:32.485 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:32.486 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:32.486 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:32.486 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:32.486 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:32.486 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:32.486 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:32.486 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:32.486 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:32.486 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:32.486 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:32.486 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:32.746 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:32.746 "name": "Existed_Raid", 00:10:32.746 "uuid": "e5eef6b5-1e1e-41a5-a063-7ac388be6beb", 00:10:32.746 "strip_size_kb": 64, 00:10:32.746 "state": "offline", 00:10:32.746 "raid_level": "raid0", 00:10:32.746 "superblock": false, 00:10:32.746 "num_base_bdevs": 2, 00:10:32.746 "num_base_bdevs_discovered": 1, 00:10:32.746 "num_base_bdevs_operational": 1, 00:10:32.746 "base_bdevs_list": [ 00:10:32.746 { 00:10:32.746 "name": null, 00:10:32.746 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:32.746 "is_configured": false, 00:10:32.746 "data_offset": 0, 00:10:32.746 "data_size": 65536 00:10:32.746 }, 00:10:32.746 { 00:10:32.746 "name": "BaseBdev2", 00:10:32.746 "uuid": "684abb57-272a-41c6-b047-3c0aa166e137", 00:10:32.746 "is_configured": true, 00:10:32.746 "data_offset": 0, 00:10:32.746 "data_size": 65536 00:10:32.746 } 00:10:32.746 ] 00:10:32.746 }' 00:10:32.746 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:10:32.746 10:05:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:33.315 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:33.315 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:33.315 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:33.315 10:05:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:33.315 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:33.315 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:33.315 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:33.575 [2024-06-10 10:05:55.278305] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:33.575 [2024-06-10 10:05:55.278336] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd6b1c0 name Existed_Raid, state offline 00:10:33.575 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:33.575 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:33.575 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:33.575 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:33.835 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 955987 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 955987 ']' 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 955987 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 955987 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 955987' 00:10:33.836 killing process with pid 955987 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 955987 00:10:33.836 [2024-06-10 10:05:55.536565] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # 
wait 955987 00:10:33.836 [2024-06-10 10:05:55.537157] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:33.836 00:10:33.836 real 0m8.718s 00:10:33.836 user 0m15.848s 00:10:33.836 sys 0m1.323s 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:33.836 10:05:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:33.836 ************************************ 00:10:33.836 END TEST raid_state_function_test 00:10:33.836 ************************************ 00:10:33.836 10:05:55 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:10:33.836 10:05:55 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:33.836 10:05:55 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:33.836 10:05:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:34.097 ************************************ 00:10:34.097 START TEST raid_state_function_test_sb 00:10:34.097 ************************************ 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 2 true 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # 
strip_size=64 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=957647 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 957647' 00:10:34.097 Process raid pid: 957647 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 957647 /var/tmp/spdk-raid.sock 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 957647 ']' 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:34.097 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:34.097 10:05:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:34.097 [2024-06-10 10:05:55.794259] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:10:34.097 [2024-06-10 10:05:55.794312] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:34.097 [2024-06-10 10:05:55.887889] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:34.357 [2024-06-10 10:05:55.963535] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:34.357 [2024-06-10 10:05:56.005956] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:34.357 [2024-06-10 10:05:56.005979] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:34.928 10:05:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:34.928 10:05:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:10:34.928 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:35.188 [2024-06-10 10:05:56.801554] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:35.188 [2024-06-10 10:05:56.801587] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:35.188 [2024-06-10 10:05:56.801593] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:35.188 [2024-06-10 10:05:56.801599] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:35.188 "name": "Existed_Raid", 00:10:35.188 "uuid": "6f42ab8f-cc17-470a-8d4a-4a0477555461", 00:10:35.188 "strip_size_kb": 64, 00:10:35.188 "state": "configuring", 00:10:35.188 "raid_level": "raid0", 00:10:35.188 
"superblock": true, 00:10:35.188 "num_base_bdevs": 2, 00:10:35.188 "num_base_bdevs_discovered": 0, 00:10:35.188 "num_base_bdevs_operational": 2, 00:10:35.188 "base_bdevs_list": [ 00:10:35.188 { 00:10:35.188 "name": "BaseBdev1", 00:10:35.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:35.188 "is_configured": false, 00:10:35.188 "data_offset": 0, 00:10:35.188 "data_size": 0 00:10:35.188 }, 00:10:35.188 { 00:10:35.188 "name": "BaseBdev2", 00:10:35.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:35.188 "is_configured": false, 00:10:35.188 "data_offset": 0, 00:10:35.188 "data_size": 0 00:10:35.188 } 00:10:35.188 ] 00:10:35.188 }' 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:35.188 10:05:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:35.757 10:05:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:36.017 [2024-06-10 10:05:57.679652] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:36.017 [2024-06-10 10:05:57.679669] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11cab00 name Existed_Raid, state configuring 00:10:36.017 10:05:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:36.017 [2024-06-10 10:05:57.868156] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:36.017 [2024-06-10 10:05:57.868172] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:36.017 [2024-06-10 10:05:57.868177] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:36.017 [2024-06-10 10:05:57.868182] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:36.277 10:05:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:36.277 [2024-06-10 10:05:58.063178] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:36.277 BaseBdev1 00:10:36.277 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:36.277 10:05:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:10:36.277 10:05:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:36.277 10:05:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:10:36.277 10:05:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:36.277 10:05:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:36.277 10:05:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:36.537 10:05:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:36.797 [ 00:10:36.797 { 00:10:36.797 "name": "BaseBdev1", 00:10:36.797 "aliases": [ 00:10:36.797 "fbb7f423-e4f6-4d0a-b9bb-21ff332b812d" 00:10:36.797 ], 00:10:36.797 "product_name": "Malloc disk", 00:10:36.797 "block_size": 512, 00:10:36.797 "num_blocks": 65536, 00:10:36.797 "uuid": "fbb7f423-e4f6-4d0a-b9bb-21ff332b812d", 00:10:36.797 "assigned_rate_limits": { 00:10:36.797 "rw_ios_per_sec": 0, 00:10:36.797 "rw_mbytes_per_sec": 0, 00:10:36.797 "r_mbytes_per_sec": 0, 00:10:36.797 "w_mbytes_per_sec": 0 00:10:36.797 }, 00:10:36.797 "claimed": true, 00:10:36.797 "claim_type": "exclusive_write", 00:10:36.797 "zoned": false, 00:10:36.797 "supported_io_types": { 00:10:36.797 "read": true, 00:10:36.797 "write": true, 00:10:36.797 "unmap": true, 00:10:36.798 "write_zeroes": true, 00:10:36.798 "flush": true, 00:10:36.798 "reset": true, 00:10:36.798 "compare": false, 00:10:36.798 "compare_and_write": false, 00:10:36.798 "abort": true, 00:10:36.798 "nvme_admin": false, 00:10:36.798 "nvme_io": false 00:10:36.798 }, 00:10:36.798 "memory_domains": [ 00:10:36.798 { 00:10:36.798 "dma_device_id": "system", 00:10:36.798 "dma_device_type": 1 00:10:36.798 }, 00:10:36.798 { 00:10:36.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.798 "dma_device_type": 2 00:10:36.798 } 00:10:36.798 ], 00:10:36.798 "driver_specific": {} 00:10:36.798 } 00:10:36.798 ] 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:36.798 "name": "Existed_Raid", 00:10:36.798 "uuid": "d253a2ef-5c2b-4b7a-ac88-51ecd99e3bb6", 00:10:36.798 "strip_size_kb": 64, 00:10:36.798 "state": "configuring", 00:10:36.798 "raid_level": "raid0", 00:10:36.798 "superblock": true, 00:10:36.798 "num_base_bdevs": 2, 00:10:36.798 "num_base_bdevs_discovered": 1, 00:10:36.798 "num_base_bdevs_operational": 2, 00:10:36.798 
"base_bdevs_list": [ 00:10:36.798 { 00:10:36.798 "name": "BaseBdev1", 00:10:36.798 "uuid": "fbb7f423-e4f6-4d0a-b9bb-21ff332b812d", 00:10:36.798 "is_configured": true, 00:10:36.798 "data_offset": 2048, 00:10:36.798 "data_size": 63488 00:10:36.798 }, 00:10:36.798 { 00:10:36.798 "name": "BaseBdev2", 00:10:36.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:36.798 "is_configured": false, 00:10:36.798 "data_offset": 0, 00:10:36.798 "data_size": 0 00:10:36.798 } 00:10:36.798 ] 00:10:36.798 }' 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:36.798 10:05:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:37.369 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:37.630 [2024-06-10 10:05:59.286268] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:37.630 [2024-06-10 10:05:59.286293] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11ca3f0 name Existed_Raid, state configuring 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:37.630 [2024-06-10 10:05:59.474776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:37.630 [2024-06-10 10:05:59.475922] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:37.630 [2024-06-10 10:05:59.475948] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:37.630 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:10:37.938 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:37.938 "name": "Existed_Raid", 00:10:37.938 "uuid": "6dbb3193-bef6-4191-99ab-772df294df4d", 00:10:37.938 "strip_size_kb": 64, 00:10:37.938 "state": "configuring", 00:10:37.938 "raid_level": "raid0", 00:10:37.938 "superblock": true, 00:10:37.938 "num_base_bdevs": 2, 00:10:37.938 "num_base_bdevs_discovered": 1, 00:10:37.938 "num_base_bdevs_operational": 2, 00:10:37.938 "base_bdevs_list": [ 00:10:37.938 { 00:10:37.938 "name": "BaseBdev1", 00:10:37.938 "uuid": "fbb7f423-e4f6-4d0a-b9bb-21ff332b812d", 00:10:37.938 "is_configured": true, 00:10:37.938 "data_offset": 2048, 00:10:37.938 "data_size": 63488 00:10:37.938 }, 00:10:37.938 { 00:10:37.938 "name": "BaseBdev2", 00:10:37.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:37.938 "is_configured": false, 00:10:37.938 "data_offset": 0, 00:10:37.938 "data_size": 0 00:10:37.938 } 00:10:37.938 ] 00:10:37.938 }' 00:10:37.938 10:05:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:37.938 10:05:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:38.560 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:38.560 [2024-06-10 10:06:00.386209] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:38.560 [2024-06-10 10:06:00.386319] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11cb1c0 00:10:38.560 [2024-06-10 10:06:00.386327] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:38.560 [2024-06-10 10:06:00.386465] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x137e220 00:10:38.560 [2024-06-10 10:06:00.386552] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11cb1c0 00:10:38.560 [2024-06-10 10:06:00.386557] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11cb1c0 00:10:38.560 [2024-06-10 10:06:00.386629] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:38.560 BaseBdev2 00:10:38.560 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:38.560 10:06:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:10:38.560 10:06:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:38.560 10:06:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:10:38.560 10:06:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:38.560 10:06:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:38.561 10:06:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:38.820 10:06:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:39.111 [ 00:10:39.111 { 00:10:39.111 "name": "BaseBdev2", 00:10:39.111 
"aliases": [ 00:10:39.111 "ee93a898-4ada-498a-911a-2deb3bdd3243" 00:10:39.111 ], 00:10:39.111 "product_name": "Malloc disk", 00:10:39.111 "block_size": 512, 00:10:39.111 "num_blocks": 65536, 00:10:39.111 "uuid": "ee93a898-4ada-498a-911a-2deb3bdd3243", 00:10:39.111 "assigned_rate_limits": { 00:10:39.111 "rw_ios_per_sec": 0, 00:10:39.111 "rw_mbytes_per_sec": 0, 00:10:39.111 "r_mbytes_per_sec": 0, 00:10:39.111 "w_mbytes_per_sec": 0 00:10:39.111 }, 00:10:39.111 "claimed": true, 00:10:39.111 "claim_type": "exclusive_write", 00:10:39.111 "zoned": false, 00:10:39.111 "supported_io_types": { 00:10:39.111 "read": true, 00:10:39.111 "write": true, 00:10:39.111 "unmap": true, 00:10:39.111 "write_zeroes": true, 00:10:39.111 "flush": true, 00:10:39.111 "reset": true, 00:10:39.111 "compare": false, 00:10:39.111 "compare_and_write": false, 00:10:39.111 "abort": true, 00:10:39.111 "nvme_admin": false, 00:10:39.111 "nvme_io": false 00:10:39.111 }, 00:10:39.111 "memory_domains": [ 00:10:39.111 { 00:10:39.111 "dma_device_id": "system", 00:10:39.111 "dma_device_type": 1 00:10:39.111 }, 00:10:39.111 { 00:10:39.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:39.111 "dma_device_type": 2 00:10:39.111 } 00:10:39.111 ], 00:10:39.111 "driver_specific": {} 00:10:39.111 } 00:10:39.111 ] 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:39.111 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:39.111 "name": "Existed_Raid", 00:10:39.111 "uuid": "6dbb3193-bef6-4191-99ab-772df294df4d", 00:10:39.111 "strip_size_kb": 64, 00:10:39.111 "state": "online", 00:10:39.111 "raid_level": "raid0", 00:10:39.111 "superblock": true, 00:10:39.111 "num_base_bdevs": 2, 00:10:39.111 "num_base_bdevs_discovered": 2, 
00:10:39.111 "num_base_bdevs_operational": 2, 00:10:39.112 "base_bdevs_list": [ 00:10:39.112 { 00:10:39.112 "name": "BaseBdev1", 00:10:39.112 "uuid": "fbb7f423-e4f6-4d0a-b9bb-21ff332b812d", 00:10:39.112 "is_configured": true, 00:10:39.112 "data_offset": 2048, 00:10:39.112 "data_size": 63488 00:10:39.112 }, 00:10:39.112 { 00:10:39.112 "name": "BaseBdev2", 00:10:39.112 "uuid": "ee93a898-4ada-498a-911a-2deb3bdd3243", 00:10:39.112 "is_configured": true, 00:10:39.112 "data_offset": 2048, 00:10:39.112 "data_size": 63488 00:10:39.112 } 00:10:39.112 ] 00:10:39.112 }' 00:10:39.112 10:06:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:39.112 10:06:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:39.682 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:39.682 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:39.682 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:39.682 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:39.682 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:39.682 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:39.682 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:39.682 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:39.942 [2024-06-10 10:06:01.693711] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:39.942 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:39.942 "name": "Existed_Raid", 00:10:39.942 "aliases": [ 00:10:39.942 "6dbb3193-bef6-4191-99ab-772df294df4d" 00:10:39.942 ], 00:10:39.942 "product_name": "Raid Volume", 00:10:39.942 "block_size": 512, 00:10:39.942 "num_blocks": 126976, 00:10:39.942 "uuid": "6dbb3193-bef6-4191-99ab-772df294df4d", 00:10:39.942 "assigned_rate_limits": { 00:10:39.942 "rw_ios_per_sec": 0, 00:10:39.942 "rw_mbytes_per_sec": 0, 00:10:39.942 "r_mbytes_per_sec": 0, 00:10:39.942 "w_mbytes_per_sec": 0 00:10:39.942 }, 00:10:39.942 "claimed": false, 00:10:39.942 "zoned": false, 00:10:39.942 "supported_io_types": { 00:10:39.942 "read": true, 00:10:39.942 "write": true, 00:10:39.942 "unmap": true, 00:10:39.942 "write_zeroes": true, 00:10:39.942 "flush": true, 00:10:39.942 "reset": true, 00:10:39.942 "compare": false, 00:10:39.942 "compare_and_write": false, 00:10:39.943 "abort": false, 00:10:39.943 "nvme_admin": false, 00:10:39.943 "nvme_io": false 00:10:39.943 }, 00:10:39.943 "memory_domains": [ 00:10:39.943 { 00:10:39.943 "dma_device_id": "system", 00:10:39.943 "dma_device_type": 1 00:10:39.943 }, 00:10:39.943 { 00:10:39.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:39.943 "dma_device_type": 2 00:10:39.943 }, 00:10:39.943 { 00:10:39.943 "dma_device_id": "system", 00:10:39.943 "dma_device_type": 1 00:10:39.943 }, 00:10:39.943 { 00:10:39.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:39.943 "dma_device_type": 2 00:10:39.943 } 00:10:39.943 ], 00:10:39.943 "driver_specific": { 00:10:39.943 "raid": { 00:10:39.943 "uuid": 
"6dbb3193-bef6-4191-99ab-772df294df4d", 00:10:39.943 "strip_size_kb": 64, 00:10:39.943 "state": "online", 00:10:39.943 "raid_level": "raid0", 00:10:39.943 "superblock": true, 00:10:39.943 "num_base_bdevs": 2, 00:10:39.943 "num_base_bdevs_discovered": 2, 00:10:39.943 "num_base_bdevs_operational": 2, 00:10:39.943 "base_bdevs_list": [ 00:10:39.943 { 00:10:39.943 "name": "BaseBdev1", 00:10:39.943 "uuid": "fbb7f423-e4f6-4d0a-b9bb-21ff332b812d", 00:10:39.943 "is_configured": true, 00:10:39.943 "data_offset": 2048, 00:10:39.943 "data_size": 63488 00:10:39.943 }, 00:10:39.943 { 00:10:39.943 "name": "BaseBdev2", 00:10:39.943 "uuid": "ee93a898-4ada-498a-911a-2deb3bdd3243", 00:10:39.943 "is_configured": true, 00:10:39.943 "data_offset": 2048, 00:10:39.943 "data_size": 63488 00:10:39.943 } 00:10:39.943 ] 00:10:39.943 } 00:10:39.943 } 00:10:39.943 }' 00:10:39.943 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:39.943 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:39.943 BaseBdev2' 00:10:39.943 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:39.943 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:39.943 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:40.203 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:40.203 "name": "BaseBdev1", 00:10:40.203 "aliases": [ 00:10:40.203 "fbb7f423-e4f6-4d0a-b9bb-21ff332b812d" 00:10:40.203 ], 00:10:40.203 "product_name": "Malloc disk", 00:10:40.203 "block_size": 512, 00:10:40.203 "num_blocks": 65536, 00:10:40.203 "uuid": "fbb7f423-e4f6-4d0a-b9bb-21ff332b812d", 00:10:40.203 "assigned_rate_limits": { 00:10:40.203 "rw_ios_per_sec": 0, 00:10:40.203 "rw_mbytes_per_sec": 0, 00:10:40.203 "r_mbytes_per_sec": 0, 00:10:40.203 "w_mbytes_per_sec": 0 00:10:40.203 }, 00:10:40.203 "claimed": true, 00:10:40.203 "claim_type": "exclusive_write", 00:10:40.203 "zoned": false, 00:10:40.203 "supported_io_types": { 00:10:40.203 "read": true, 00:10:40.203 "write": true, 00:10:40.203 "unmap": true, 00:10:40.203 "write_zeroes": true, 00:10:40.203 "flush": true, 00:10:40.203 "reset": true, 00:10:40.203 "compare": false, 00:10:40.203 "compare_and_write": false, 00:10:40.203 "abort": true, 00:10:40.203 "nvme_admin": false, 00:10:40.203 "nvme_io": false 00:10:40.203 }, 00:10:40.203 "memory_domains": [ 00:10:40.203 { 00:10:40.203 "dma_device_id": "system", 00:10:40.203 "dma_device_type": 1 00:10:40.203 }, 00:10:40.203 { 00:10:40.203 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.203 "dma_device_type": 2 00:10:40.203 } 00:10:40.204 ], 00:10:40.204 "driver_specific": {} 00:10:40.204 }' 00:10:40.204 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:40.204 10:06:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:40.204 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:40.204 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:40.463 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:40.463 
10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:40.463 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:40.463 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:40.463 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:40.463 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:40.464 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:40.464 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:40.464 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:40.464 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:40.464 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:40.723 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:40.723 "name": "BaseBdev2", 00:10:40.723 "aliases": [ 00:10:40.723 "ee93a898-4ada-498a-911a-2deb3bdd3243" 00:10:40.723 ], 00:10:40.723 "product_name": "Malloc disk", 00:10:40.723 "block_size": 512, 00:10:40.723 "num_blocks": 65536, 00:10:40.723 "uuid": "ee93a898-4ada-498a-911a-2deb3bdd3243", 00:10:40.723 "assigned_rate_limits": { 00:10:40.723 "rw_ios_per_sec": 0, 00:10:40.724 "rw_mbytes_per_sec": 0, 00:10:40.724 "r_mbytes_per_sec": 0, 00:10:40.724 "w_mbytes_per_sec": 0 00:10:40.724 }, 00:10:40.724 "claimed": true, 00:10:40.724 "claim_type": "exclusive_write", 00:10:40.724 "zoned": false, 00:10:40.724 "supported_io_types": { 00:10:40.724 "read": true, 00:10:40.724 "write": true, 00:10:40.724 "unmap": true, 00:10:40.724 "write_zeroes": true, 00:10:40.724 "flush": true, 00:10:40.724 "reset": true, 00:10:40.724 "compare": false, 00:10:40.724 "compare_and_write": false, 00:10:40.724 "abort": true, 00:10:40.724 "nvme_admin": false, 00:10:40.724 "nvme_io": false 00:10:40.724 }, 00:10:40.724 "memory_domains": [ 00:10:40.724 { 00:10:40.724 "dma_device_id": "system", 00:10:40.724 "dma_device_type": 1 00:10:40.724 }, 00:10:40.724 { 00:10:40.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.724 "dma_device_type": 2 00:10:40.724 } 00:10:40.724 ], 00:10:40.724 "driver_specific": {} 00:10:40.724 }' 00:10:40.724 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:40.724 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:40.724 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:40.724 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:40.984 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:40.984 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:40.984 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:40.984 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:40.984 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:40.984 
10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:40.984 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:40.984 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:40.984 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:41.243 [2024-06-10 10:06:02.984834] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:41.243 [2024-06-10 10:06:02.984854] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:41.243 [2024-06-10 10:06:02.984884] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:41.243 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:41.243 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:41.243 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:41.243 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:10:41.243 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:41.243 10:06:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:41.243 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:41.243 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:41.243 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:41.243 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:41.243 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:41.243 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:41.243 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:41.243 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:41.244 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:41.244 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:41.244 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:41.503 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:41.503 "name": "Existed_Raid", 00:10:41.503 "uuid": "6dbb3193-bef6-4191-99ab-772df294df4d", 00:10:41.503 "strip_size_kb": 64, 00:10:41.503 "state": "offline", 00:10:41.503 "raid_level": "raid0", 00:10:41.503 "superblock": true, 00:10:41.503 "num_base_bdevs": 2, 00:10:41.503 "num_base_bdevs_discovered": 1, 00:10:41.503 "num_base_bdevs_operational": 1, 00:10:41.503 "base_bdevs_list": [ 00:10:41.503 { 00:10:41.503 "name": null, 00:10:41.503 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:41.503 "is_configured": 
false, 00:10:41.503 "data_offset": 2048, 00:10:41.503 "data_size": 63488 00:10:41.503 }, 00:10:41.503 { 00:10:41.503 "name": "BaseBdev2", 00:10:41.503 "uuid": "ee93a898-4ada-498a-911a-2deb3bdd3243", 00:10:41.503 "is_configured": true, 00:10:41.503 "data_offset": 2048, 00:10:41.503 "data_size": 63488 00:10:41.503 } 00:10:41.503 ] 00:10:41.503 }' 00:10:41.503 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:41.503 10:06:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:42.073 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:42.073 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:42.073 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:42.073 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:42.073 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:42.073 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:42.073 10:06:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:42.332 [2024-06-10 10:06:04.079606] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:42.332 [2024-06-10 10:06:04.079640] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11cb1c0 name Existed_Raid, state offline 00:10:42.332 10:06:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:42.332 10:06:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:42.332 10:06:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:42.332 10:06:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 957647 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 957647 ']' 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 957647 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 957647 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 
-- # '[' reactor_0 = sudo ']' 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 957647' 00:10:42.593 killing process with pid 957647 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 957647 00:10:42.593 [2024-06-10 10:06:04.341772] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:42.593 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 957647 00:10:42.593 [2024-06-10 10:06:04.342369] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:42.854 10:06:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:42.854 00:10:42.854 real 0m8.728s 00:10:42.854 user 0m15.876s 00:10:42.854 sys 0m1.333s 00:10:42.854 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:42.854 10:06:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:42.854 ************************************ 00:10:42.854 END TEST raid_state_function_test_sb 00:10:42.854 ************************************ 00:10:42.854 10:06:04 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:10:42.854 10:06:04 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:10:42.854 10:06:04 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:42.854 10:06:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:42.854 ************************************ 00:10:42.854 START TEST raid_superblock_test 00:10:42.854 ************************************ 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 2 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:10:42.854 10:06:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=959459 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 959459 /var/tmp/spdk-raid.sock 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 959459 ']' 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:42.854 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:42.854 10:06:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:42.854 [2024-06-10 10:06:04.589769] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:10:42.854 [2024-06-10 10:06:04.589817] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid959459 ] 00:10:42.854 [2024-06-10 10:06:04.681830] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:43.114 [2024-06-10 10:06:04.756782] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:43.114 [2024-06-10 10:06:04.806313] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:43.114 [2024-06-10 10:06:04.806338] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:43.683 10:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:43.683 10:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:10:43.683 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:43.683 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:43.683 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:43.683 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:43.683 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:43.683 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:43.683 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:43.683 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:43.683 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:43.943 malloc1 00:10:43.943 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:43.943 [2024-06-10 10:06:05.745887] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:43.943 [2024-06-10 10:06:05.745925] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:43.943 [2024-06-10 10:06:05.745935] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fd990 00:10:43.943 [2024-06-10 10:06:05.745942] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:43.943 [2024-06-10 10:06:05.747235] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:43.943 [2024-06-10 10:06:05.747253] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:43.943 pt1 00:10:43.943 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:43.943 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:43.943 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:43.943 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:43.943 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:43.943 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:43.943 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:43.943 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:43.943 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:44.203 malloc2 00:10:44.203 10:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:44.203 [2024-06-10 10:06:06.040490] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:44.203 [2024-06-10 10:06:06.040518] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:44.203 [2024-06-10 10:06:06.040528] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fe4e0 00:10:44.203 [2024-06-10 10:06:06.040534] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:44.203 [2024-06-10 10:06:06.041708] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:44.203 [2024-06-10 10:06:06.041727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:44.203 pt2 00:10:44.203 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:44.203 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:44.203 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:44.463 [2024-06-10 10:06:06.184877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev pt1 is claimed 00:10:44.463 [2024-06-10 10:06:06.185849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:44.463 [2024-06-10 10:06:06.185955] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1aa6bc0 00:10:44.463 [2024-06-10 10:06:06.185963] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:44.463 [2024-06-10 10:06:06.186103] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1aabae0 00:10:44.463 [2024-06-10 10:06:06.186204] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1aa6bc0 00:10:44.463 [2024-06-10 10:06:06.186210] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1aa6bc0 00:10:44.463 [2024-06-10 10:06:06.186275] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:44.463 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:44.723 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:44.723 "name": "raid_bdev1", 00:10:44.723 "uuid": "5e7e1831-01b4-4aba-af7a-2e287c6e489b", 00:10:44.723 "strip_size_kb": 64, 00:10:44.723 "state": "online", 00:10:44.723 "raid_level": "raid0", 00:10:44.723 "superblock": true, 00:10:44.723 "num_base_bdevs": 2, 00:10:44.723 "num_base_bdevs_discovered": 2, 00:10:44.723 "num_base_bdevs_operational": 2, 00:10:44.723 "base_bdevs_list": [ 00:10:44.723 { 00:10:44.723 "name": "pt1", 00:10:44.723 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:44.723 "is_configured": true, 00:10:44.723 "data_offset": 2048, 00:10:44.723 "data_size": 63488 00:10:44.723 }, 00:10:44.723 { 00:10:44.723 "name": "pt2", 00:10:44.723 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:44.723 "is_configured": true, 00:10:44.723 "data_offset": 2048, 00:10:44.723 "data_size": 63488 00:10:44.723 } 00:10:44.723 ] 00:10:44.723 }' 00:10:44.723 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:44.723 10:06:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:45.294 10:06:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:45.294 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:45.294 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:45.294 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:45.294 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:45.294 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:45.294 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:45.294 10:06:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:45.294 [2024-06-10 10:06:07.119389] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:45.294 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:45.294 "name": "raid_bdev1", 00:10:45.294 "aliases": [ 00:10:45.294 "5e7e1831-01b4-4aba-af7a-2e287c6e489b" 00:10:45.294 ], 00:10:45.294 "product_name": "Raid Volume", 00:10:45.294 "block_size": 512, 00:10:45.294 "num_blocks": 126976, 00:10:45.294 "uuid": "5e7e1831-01b4-4aba-af7a-2e287c6e489b", 00:10:45.294 "assigned_rate_limits": { 00:10:45.294 "rw_ios_per_sec": 0, 00:10:45.294 "rw_mbytes_per_sec": 0, 00:10:45.294 "r_mbytes_per_sec": 0, 00:10:45.294 "w_mbytes_per_sec": 0 00:10:45.294 }, 00:10:45.294 "claimed": false, 00:10:45.294 "zoned": false, 00:10:45.294 "supported_io_types": { 00:10:45.294 "read": true, 00:10:45.294 "write": true, 00:10:45.294 "unmap": true, 00:10:45.294 "write_zeroes": true, 00:10:45.294 "flush": true, 00:10:45.294 "reset": true, 00:10:45.294 "compare": false, 00:10:45.294 "compare_and_write": false, 00:10:45.294 "abort": false, 00:10:45.294 "nvme_admin": false, 00:10:45.294 "nvme_io": false 00:10:45.294 }, 00:10:45.294 "memory_domains": [ 00:10:45.294 { 00:10:45.294 "dma_device_id": "system", 00:10:45.294 "dma_device_type": 1 00:10:45.294 }, 00:10:45.294 { 00:10:45.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:45.294 "dma_device_type": 2 00:10:45.294 }, 00:10:45.294 { 00:10:45.294 "dma_device_id": "system", 00:10:45.294 "dma_device_type": 1 00:10:45.294 }, 00:10:45.294 { 00:10:45.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:45.294 "dma_device_type": 2 00:10:45.294 } 00:10:45.294 ], 00:10:45.294 "driver_specific": { 00:10:45.294 "raid": { 00:10:45.294 "uuid": "5e7e1831-01b4-4aba-af7a-2e287c6e489b", 00:10:45.294 "strip_size_kb": 64, 00:10:45.294 "state": "online", 00:10:45.294 "raid_level": "raid0", 00:10:45.294 "superblock": true, 00:10:45.294 "num_base_bdevs": 2, 00:10:45.294 "num_base_bdevs_discovered": 2, 00:10:45.294 "num_base_bdevs_operational": 2, 00:10:45.294 "base_bdevs_list": [ 00:10:45.294 { 00:10:45.294 "name": "pt1", 00:10:45.294 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:45.294 "is_configured": true, 00:10:45.294 "data_offset": 2048, 00:10:45.294 "data_size": 63488 00:10:45.294 }, 00:10:45.294 { 00:10:45.294 "name": "pt2", 00:10:45.294 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:45.294 "is_configured": true, 00:10:45.294 "data_offset": 2048, 00:10:45.294 "data_size": 63488 00:10:45.294 } 00:10:45.294 ] 00:10:45.294 } 00:10:45.294 } 00:10:45.294 }' 00:10:45.294 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq 
-r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:45.553 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:45.553 pt2' 00:10:45.554 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:45.554 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:45.554 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:45.554 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:45.554 "name": "pt1", 00:10:45.554 "aliases": [ 00:10:45.554 "00000000-0000-0000-0000-000000000001" 00:10:45.554 ], 00:10:45.554 "product_name": "passthru", 00:10:45.554 "block_size": 512, 00:10:45.554 "num_blocks": 65536, 00:10:45.554 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:45.554 "assigned_rate_limits": { 00:10:45.554 "rw_ios_per_sec": 0, 00:10:45.554 "rw_mbytes_per_sec": 0, 00:10:45.554 "r_mbytes_per_sec": 0, 00:10:45.554 "w_mbytes_per_sec": 0 00:10:45.554 }, 00:10:45.554 "claimed": true, 00:10:45.554 "claim_type": "exclusive_write", 00:10:45.554 "zoned": false, 00:10:45.554 "supported_io_types": { 00:10:45.554 "read": true, 00:10:45.554 "write": true, 00:10:45.554 "unmap": true, 00:10:45.554 "write_zeroes": true, 00:10:45.554 "flush": true, 00:10:45.554 "reset": true, 00:10:45.554 "compare": false, 00:10:45.554 "compare_and_write": false, 00:10:45.554 "abort": true, 00:10:45.554 "nvme_admin": false, 00:10:45.554 "nvme_io": false 00:10:45.554 }, 00:10:45.554 "memory_domains": [ 00:10:45.554 { 00:10:45.554 "dma_device_id": "system", 00:10:45.554 "dma_device_type": 1 00:10:45.554 }, 00:10:45.554 { 00:10:45.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:45.554 "dma_device_type": 2 00:10:45.554 } 00:10:45.554 ], 00:10:45.554 "driver_specific": { 00:10:45.554 "passthru": { 00:10:45.554 "name": "pt1", 00:10:45.554 "base_bdev_name": "malloc1" 00:10:45.554 } 00:10:45.554 } 00:10:45.554 }' 00:10:45.554 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:45.554 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:45.554 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:45.554 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:45.814 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:45.814 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:45.814 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:45.814 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:45.814 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:45.814 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:45.814 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:45.814 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:45.814 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:45.814 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:45.814 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:46.074 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:46.074 "name": "pt2", 00:10:46.074 "aliases": [ 00:10:46.074 "00000000-0000-0000-0000-000000000002" 00:10:46.074 ], 00:10:46.074 "product_name": "passthru", 00:10:46.074 "block_size": 512, 00:10:46.074 "num_blocks": 65536, 00:10:46.074 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:46.074 "assigned_rate_limits": { 00:10:46.074 "rw_ios_per_sec": 0, 00:10:46.074 "rw_mbytes_per_sec": 0, 00:10:46.074 "r_mbytes_per_sec": 0, 00:10:46.074 "w_mbytes_per_sec": 0 00:10:46.074 }, 00:10:46.074 "claimed": true, 00:10:46.074 "claim_type": "exclusive_write", 00:10:46.074 "zoned": false, 00:10:46.074 "supported_io_types": { 00:10:46.074 "read": true, 00:10:46.074 "write": true, 00:10:46.074 "unmap": true, 00:10:46.074 "write_zeroes": true, 00:10:46.074 "flush": true, 00:10:46.074 "reset": true, 00:10:46.074 "compare": false, 00:10:46.074 "compare_and_write": false, 00:10:46.074 "abort": true, 00:10:46.074 "nvme_admin": false, 00:10:46.074 "nvme_io": false 00:10:46.074 }, 00:10:46.074 "memory_domains": [ 00:10:46.074 { 00:10:46.074 "dma_device_id": "system", 00:10:46.074 "dma_device_type": 1 00:10:46.074 }, 00:10:46.074 { 00:10:46.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.074 "dma_device_type": 2 00:10:46.074 } 00:10:46.074 ], 00:10:46.074 "driver_specific": { 00:10:46.074 "passthru": { 00:10:46.074 "name": "pt2", 00:10:46.074 "base_bdev_name": "malloc2" 00:10:46.074 } 00:10:46.074 } 00:10:46.074 }' 00:10:46.074 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:46.074 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:46.334 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:46.334 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:46.334 10:06:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:46.334 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:46.334 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:46.334 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:46.334 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:46.334 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:46.334 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:46.334 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:46.594 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:46.594 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:46.594 [2024-06-10 10:06:08.382594] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:46.594 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=5e7e1831-01b4-4aba-af7a-2e287c6e489b 00:10:46.594 10:06:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@435 -- # '[' -z 5e7e1831-01b4-4aba-af7a-2e287c6e489b ']' 00:10:46.594 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:46.854 [2024-06-10 10:06:08.574918] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:46.854 [2024-06-10 10:06:08.574928] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:46.854 [2024-06-10 10:06:08.574964] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:46.854 [2024-06-10 10:06:08.574996] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:46.854 [2024-06-10 10:06:08.575001] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aa6bc0 name raid_bdev1, state offline 00:10:46.854 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:46.854 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:47.114 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:47.114 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:47.114 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:47.114 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:47.114 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:47.114 10:06:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:47.374 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:47.374 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:47.635 
10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:47.635 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:47.895 [2024-06-10 10:06:09.521271] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:47.895 [2024-06-10 10:06:09.522341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:47.895 [2024-06-10 10:06:09.522384] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:47.895 [2024-06-10 10:06:09.522412] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:47.895 [2024-06-10 10:06:09.522422] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:47.895 [2024-06-10 10:06:09.522427] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aab860 name raid_bdev1, state configuring 00:10:47.895 request: 00:10:47.895 { 00:10:47.895 "name": "raid_bdev1", 00:10:47.895 "raid_level": "raid0", 00:10:47.895 "base_bdevs": [ 00:10:47.895 "malloc1", 00:10:47.895 "malloc2" 00:10:47.895 ], 00:10:47.895 "superblock": false, 00:10:47.895 "strip_size_kb": 64, 00:10:47.895 "method": "bdev_raid_create", 00:10:47.895 "req_id": 1 00:10:47.895 } 00:10:47.895 Got JSON-RPC error response 00:10:47.895 response: 00:10:47.895 { 00:10:47.895 "code": -17, 00:10:47.895 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:47.895 } 00:10:47.895 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:10:47.895 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:10:47.895 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:10:47.895 10:06:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:10:47.895 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.895 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:47.895 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:47.895 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:47.895 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:48.156 [2024-06-10 10:06:09.906205] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:48.156 [2024-06-10 10:06:09.906234] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:48.156 [2024-06-10 10:06:09.906247] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aa6900 00:10:48.156 [2024-06-10 10:06:09.906253] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:48.156 [2024-06-10 10:06:09.907574] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:48.156 [2024-06-10 10:06:09.907594] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:48.156 [2024-06-10 10:06:09.907641] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:48.156 [2024-06-10 10:06:09.907658] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:48.156 pt1 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:48.156 10:06:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:48.416 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:48.416 "name": "raid_bdev1", 00:10:48.416 "uuid": "5e7e1831-01b4-4aba-af7a-2e287c6e489b", 00:10:48.416 "strip_size_kb": 64, 00:10:48.416 "state": "configuring", 00:10:48.416 "raid_level": "raid0", 00:10:48.416 "superblock": true, 00:10:48.416 "num_base_bdevs": 2, 00:10:48.416 "num_base_bdevs_discovered": 1, 00:10:48.416 "num_base_bdevs_operational": 2, 00:10:48.416 "base_bdevs_list": [ 00:10:48.416 { 00:10:48.416 "name": "pt1", 00:10:48.416 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:48.416 "is_configured": true, 00:10:48.416 "data_offset": 2048, 00:10:48.417 "data_size": 63488 00:10:48.417 }, 00:10:48.417 { 00:10:48.417 "name": null, 00:10:48.417 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:48.417 "is_configured": false, 00:10:48.417 "data_offset": 2048, 00:10:48.417 "data_size": 63488 00:10:48.417 } 00:10:48.417 ] 00:10:48.417 }' 00:10:48.417 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:48.417 10:06:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:48.987 
10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:48.987 [2024-06-10 10:06:10.820629] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:48.987 [2024-06-10 10:06:10.820669] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:48.987 [2024-06-10 10:06:10.820680] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aac1b0 00:10:48.987 [2024-06-10 10:06:10.820686] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:48.987 [2024-06-10 10:06:10.820974] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:48.987 [2024-06-10 10:06:10.820985] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:48.987 [2024-06-10 10:06:10.821028] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:48.987 [2024-06-10 10:06:10.821040] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:48.987 [2024-06-10 10:06:10.821111] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1aaeaa0 00:10:48.987 [2024-06-10 10:06:10.821117] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:48.987 [2024-06-10 10:06:10.821248] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1aa6b90 00:10:48.987 [2024-06-10 10:06:10.821341] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1aaeaa0 00:10:48.987 [2024-06-10 10:06:10.821346] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1aaeaa0 00:10:48.987 [2024-06-10 10:06:10.821423] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:48.987 pt2 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:48.987 10:06:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:49.247 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:49.247 "name": "raid_bdev1", 00:10:49.247 "uuid": "5e7e1831-01b4-4aba-af7a-2e287c6e489b", 00:10:49.247 "strip_size_kb": 64, 00:10:49.247 "state": "online", 00:10:49.247 "raid_level": "raid0", 00:10:49.247 "superblock": true, 00:10:49.247 "num_base_bdevs": 2, 00:10:49.247 "num_base_bdevs_discovered": 2, 00:10:49.247 "num_base_bdevs_operational": 2, 00:10:49.247 "base_bdevs_list": [ 00:10:49.247 { 00:10:49.247 "name": "pt1", 00:10:49.247 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:49.247 "is_configured": true, 00:10:49.247 "data_offset": 2048, 00:10:49.247 "data_size": 63488 00:10:49.247 }, 00:10:49.247 { 00:10:49.247 "name": "pt2", 00:10:49.247 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:49.247 "is_configured": true, 00:10:49.247 "data_offset": 2048, 00:10:49.247 "data_size": 63488 00:10:49.247 } 00:10:49.247 ] 00:10:49.247 }' 00:10:49.247 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:49.247 10:06:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:49.818 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:49.818 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:49.818 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:49.818 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:49.818 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:49.818 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:49.818 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:49.818 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:50.079 [2024-06-10 10:06:11.747175] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:50.079 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:50.079 "name": "raid_bdev1", 00:10:50.079 "aliases": [ 00:10:50.079 "5e7e1831-01b4-4aba-af7a-2e287c6e489b" 00:10:50.079 ], 00:10:50.079 "product_name": "Raid Volume", 00:10:50.079 "block_size": 512, 00:10:50.079 "num_blocks": 126976, 00:10:50.079 "uuid": "5e7e1831-01b4-4aba-af7a-2e287c6e489b", 00:10:50.079 "assigned_rate_limits": { 00:10:50.079 "rw_ios_per_sec": 0, 00:10:50.079 "rw_mbytes_per_sec": 0, 00:10:50.079 "r_mbytes_per_sec": 0, 00:10:50.079 "w_mbytes_per_sec": 0 00:10:50.079 }, 00:10:50.079 "claimed": false, 00:10:50.079 "zoned": false, 00:10:50.079 "supported_io_types": { 00:10:50.080 "read": true, 00:10:50.080 "write": true, 00:10:50.080 "unmap": true, 00:10:50.080 "write_zeroes": true, 00:10:50.080 "flush": true, 00:10:50.080 "reset": true, 00:10:50.080 "compare": false, 00:10:50.080 "compare_and_write": false, 00:10:50.080 "abort": false, 
00:10:50.080 "nvme_admin": false, 00:10:50.080 "nvme_io": false 00:10:50.080 }, 00:10:50.080 "memory_domains": [ 00:10:50.080 { 00:10:50.080 "dma_device_id": "system", 00:10:50.080 "dma_device_type": 1 00:10:50.080 }, 00:10:50.080 { 00:10:50.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.080 "dma_device_type": 2 00:10:50.080 }, 00:10:50.080 { 00:10:50.080 "dma_device_id": "system", 00:10:50.080 "dma_device_type": 1 00:10:50.080 }, 00:10:50.080 { 00:10:50.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.080 "dma_device_type": 2 00:10:50.080 } 00:10:50.080 ], 00:10:50.080 "driver_specific": { 00:10:50.080 "raid": { 00:10:50.080 "uuid": "5e7e1831-01b4-4aba-af7a-2e287c6e489b", 00:10:50.080 "strip_size_kb": 64, 00:10:50.080 "state": "online", 00:10:50.080 "raid_level": "raid0", 00:10:50.080 "superblock": true, 00:10:50.080 "num_base_bdevs": 2, 00:10:50.080 "num_base_bdevs_discovered": 2, 00:10:50.080 "num_base_bdevs_operational": 2, 00:10:50.080 "base_bdevs_list": [ 00:10:50.080 { 00:10:50.080 "name": "pt1", 00:10:50.080 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:50.080 "is_configured": true, 00:10:50.080 "data_offset": 2048, 00:10:50.080 "data_size": 63488 00:10:50.080 }, 00:10:50.080 { 00:10:50.080 "name": "pt2", 00:10:50.080 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:50.080 "is_configured": true, 00:10:50.080 "data_offset": 2048, 00:10:50.080 "data_size": 63488 00:10:50.080 } 00:10:50.080 ] 00:10:50.080 } 00:10:50.080 } 00:10:50.080 }' 00:10:50.080 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:50.080 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:50.080 pt2' 00:10:50.080 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:50.080 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:50.080 10:06:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:50.340 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:50.340 "name": "pt1", 00:10:50.340 "aliases": [ 00:10:50.340 "00000000-0000-0000-0000-000000000001" 00:10:50.340 ], 00:10:50.340 "product_name": "passthru", 00:10:50.340 "block_size": 512, 00:10:50.340 "num_blocks": 65536, 00:10:50.340 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:50.340 "assigned_rate_limits": { 00:10:50.340 "rw_ios_per_sec": 0, 00:10:50.340 "rw_mbytes_per_sec": 0, 00:10:50.340 "r_mbytes_per_sec": 0, 00:10:50.340 "w_mbytes_per_sec": 0 00:10:50.340 }, 00:10:50.340 "claimed": true, 00:10:50.340 "claim_type": "exclusive_write", 00:10:50.340 "zoned": false, 00:10:50.340 "supported_io_types": { 00:10:50.340 "read": true, 00:10:50.340 "write": true, 00:10:50.340 "unmap": true, 00:10:50.340 "write_zeroes": true, 00:10:50.340 "flush": true, 00:10:50.340 "reset": true, 00:10:50.340 "compare": false, 00:10:50.340 "compare_and_write": false, 00:10:50.340 "abort": true, 00:10:50.340 "nvme_admin": false, 00:10:50.340 "nvme_io": false 00:10:50.340 }, 00:10:50.340 "memory_domains": [ 00:10:50.340 { 00:10:50.340 "dma_device_id": "system", 00:10:50.340 "dma_device_type": 1 00:10:50.340 }, 00:10:50.340 { 00:10:50.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.340 "dma_device_type": 2 00:10:50.340 } 00:10:50.340 ], 00:10:50.340 
"driver_specific": { 00:10:50.340 "passthru": { 00:10:50.341 "name": "pt1", 00:10:50.341 "base_bdev_name": "malloc1" 00:10:50.341 } 00:10:50.341 } 00:10:50.341 }' 00:10:50.341 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.341 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.341 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:50.341 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.341 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.341 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:50.341 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.600 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.600 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:50.600 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.600 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.600 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:50.600 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:50.600 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:50.600 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:50.860 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:50.860 "name": "pt2", 00:10:50.860 "aliases": [ 00:10:50.860 "00000000-0000-0000-0000-000000000002" 00:10:50.860 ], 00:10:50.860 "product_name": "passthru", 00:10:50.860 "block_size": 512, 00:10:50.860 "num_blocks": 65536, 00:10:50.860 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:50.860 "assigned_rate_limits": { 00:10:50.860 "rw_ios_per_sec": 0, 00:10:50.860 "rw_mbytes_per_sec": 0, 00:10:50.860 "r_mbytes_per_sec": 0, 00:10:50.860 "w_mbytes_per_sec": 0 00:10:50.860 }, 00:10:50.860 "claimed": true, 00:10:50.860 "claim_type": "exclusive_write", 00:10:50.860 "zoned": false, 00:10:50.860 "supported_io_types": { 00:10:50.860 "read": true, 00:10:50.860 "write": true, 00:10:50.860 "unmap": true, 00:10:50.860 "write_zeroes": true, 00:10:50.860 "flush": true, 00:10:50.860 "reset": true, 00:10:50.860 "compare": false, 00:10:50.860 "compare_and_write": false, 00:10:50.860 "abort": true, 00:10:50.860 "nvme_admin": false, 00:10:50.860 "nvme_io": false 00:10:50.860 }, 00:10:50.860 "memory_domains": [ 00:10:50.860 { 00:10:50.860 "dma_device_id": "system", 00:10:50.860 "dma_device_type": 1 00:10:50.860 }, 00:10:50.860 { 00:10:50.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.860 "dma_device_type": 2 00:10:50.860 } 00:10:50.860 ], 00:10:50.860 "driver_specific": { 00:10:50.860 "passthru": { 00:10:50.860 "name": "pt2", 00:10:50.860 "base_bdev_name": "malloc2" 00:10:50.860 } 00:10:50.860 } 00:10:50.860 }' 00:10:50.860 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.860 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.860 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:10:50.860 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.860 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:51.121 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:51.121 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:51.121 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:51.121 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:51.121 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:51.121 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:51.121 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:51.121 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:51.121 10:06:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:51.382 [2024-06-10 10:06:13.070500] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 5e7e1831-01b4-4aba-af7a-2e287c6e489b '!=' 5e7e1831-01b4-4aba-af7a-2e287c6e489b ']' 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 959459 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 959459 ']' 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 959459 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 959459 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 959459' 00:10:51.382 killing process with pid 959459 00:10:51.382 10:06:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 959459 00:10:51.382 [2024-06-10 10:06:13.142902] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:51.383 [2024-06-10 10:06:13.142942] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:51.383 [2024-06-10 10:06:13.142973] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:51.383 [2024-06-10 10:06:13.142979] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aaeaa0 name raid_bdev1, state offline 00:10:51.383 10:06:13 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@973 -- # wait 959459 00:10:51.383 [2024-06-10 10:06:13.152085] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:51.644 10:06:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:51.644 00:10:51.644 real 0m8.735s 00:10:51.644 user 0m15.905s 00:10:51.644 sys 0m1.323s 00:10:51.644 10:06:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:51.644 10:06:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:51.644 ************************************ 00:10:51.644 END TEST raid_superblock_test 00:10:51.644 ************************************ 00:10:51.644 10:06:13 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:10:51.644 10:06:13 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:51.644 10:06:13 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:51.644 10:06:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:51.644 ************************************ 00:10:51.644 START TEST raid_read_error_test 00:10:51.644 ************************************ 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 2 read 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.l4o3t2Bj2h 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=961555 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 961555 /var/tmp/spdk-raid.sock 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 961555 ']' 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:51.644 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:51.644 10:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:51.644 [2024-06-10 10:06:13.410077] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:10:51.644 [2024-06-10 10:06:13.410131] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid961555 ] 00:10:51.644 [2024-06-10 10:06:13.502104] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:51.904 [2024-06-10 10:06:13.579573] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:51.904 [2024-06-10 10:06:13.629725] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:51.904 [2024-06-10 10:06:13.629753] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:52.474 10:06:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:52.474 10:06:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:10:52.474 10:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:52.474 10:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:52.733 BaseBdev1_malloc 00:10:52.733 10:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:52.993 true 00:10:52.993 10:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:52.993 [2024-06-10 10:06:14.780071] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:52.993 [2024-06-10 10:06:14.780106] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:52.993 [2024-06-10 10:06:14.780116] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x178ed10 00:10:52.993 [2024-06-10 10:06:14.780123] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:52.993 [2024-06-10 10:06:14.781435] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:52.993 [2024-06-10 10:06:14.781454] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:52.993 BaseBdev1 00:10:52.993 10:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:52.993 10:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:53.253 BaseBdev2_malloc 00:10:53.253 10:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:53.514 true 00:10:53.514 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:53.514 [2024-06-10 10:06:15.343017] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:53.514 [2024-06-10 10:06:15.343048] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:53.514 [2024-06-10 10:06:15.343058] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1793710 00:10:53.514 [2024-06-10 10:06:15.343064] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:53.514 [2024-06-10 10:06:15.344209] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:53.514 [2024-06-10 10:06:15.344228] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:53.514 BaseBdev2 00:10:53.514 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:53.774 [2024-06-10 10:06:15.535532] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:53.774 [2024-06-10 10:06:15.536519] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:53.774 [2024-06-10 10:06:15.536658] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17949f0 00:10:53.774 [2024-06-10 10:06:15.536666] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:53.774 [2024-06-10 10:06:15.536805] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1794cd0 00:10:53.774 [2024-06-10 10:06:15.536924] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17949f0 00:10:53.774 [2024-06-10 10:06:15.536930] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17949f0 00:10:53.774 [2024-06-10 10:06:15.537003] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:53.774 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:53.774 
10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:53.774 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:53.774 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:53.774 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:53.774 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:53.774 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:53.774 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:53.774 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:53.774 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:53.774 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:53.774 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:54.035 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:54.035 "name": "raid_bdev1", 00:10:54.035 "uuid": "d03b1395-7152-405a-8a68-08d220b035f5", 00:10:54.035 "strip_size_kb": 64, 00:10:54.035 "state": "online", 00:10:54.035 "raid_level": "raid0", 00:10:54.035 "superblock": true, 00:10:54.035 "num_base_bdevs": 2, 00:10:54.035 "num_base_bdevs_discovered": 2, 00:10:54.035 "num_base_bdevs_operational": 2, 00:10:54.035 "base_bdevs_list": [ 00:10:54.035 { 00:10:54.035 "name": "BaseBdev1", 00:10:54.035 "uuid": "7a505f01-a2e4-5039-afd1-e1e8dec33937", 00:10:54.035 "is_configured": true, 00:10:54.035 "data_offset": 2048, 00:10:54.035 "data_size": 63488 00:10:54.035 }, 00:10:54.035 { 00:10:54.035 "name": "BaseBdev2", 00:10:54.035 "uuid": "91931077-cfc6-5417-8bf7-29278463d2a2", 00:10:54.035 "is_configured": true, 00:10:54.035 "data_offset": 2048, 00:10:54.035 "data_size": 63488 00:10:54.035 } 00:10:54.035 ] 00:10:54.035 }' 00:10:54.035 10:06:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:54.035 10:06:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:54.607 10:06:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:54.607 10:06:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:54.607 [2024-06-10 10:06:16.369810] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15e47e0 00:10:55.549 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online 
raid0 64 2 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:55.810 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:56.071 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:56.071 "name": "raid_bdev1", 00:10:56.071 "uuid": "d03b1395-7152-405a-8a68-08d220b035f5", 00:10:56.071 "strip_size_kb": 64, 00:10:56.071 "state": "online", 00:10:56.071 "raid_level": "raid0", 00:10:56.071 "superblock": true, 00:10:56.071 "num_base_bdevs": 2, 00:10:56.071 "num_base_bdevs_discovered": 2, 00:10:56.071 "num_base_bdevs_operational": 2, 00:10:56.071 "base_bdevs_list": [ 00:10:56.071 { 00:10:56.071 "name": "BaseBdev1", 00:10:56.071 "uuid": "7a505f01-a2e4-5039-afd1-e1e8dec33937", 00:10:56.071 "is_configured": true, 00:10:56.071 "data_offset": 2048, 00:10:56.071 "data_size": 63488 00:10:56.071 }, 00:10:56.071 { 00:10:56.071 "name": "BaseBdev2", 00:10:56.071 "uuid": "91931077-cfc6-5417-8bf7-29278463d2a2", 00:10:56.071 "is_configured": true, 00:10:56.071 "data_offset": 2048, 00:10:56.071 "data_size": 63488 00:10:56.071 } 00:10:56.071 ] 00:10:56.071 }' 00:10:56.071 10:06:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:56.071 10:06:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:56.643 [2024-06-10 10:06:18.401294] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:56.643 [2024-06-10 10:06:18.401328] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:56.643 [2024-06-10 10:06:18.403923] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:56.643 [2024-06-10 10:06:18.403947] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:56.643 [2024-06-10 10:06:18.403965] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:56.643 [2024-06-10 10:06:18.403971] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17949f0 name raid_bdev1, state offline 00:10:56.643 0 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # 
killprocess 961555 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 961555 ']' 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 961555 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 961555 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 961555' 00:10:56.643 killing process with pid 961555 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 961555 00:10:56.643 [2024-06-10 10:06:18.467851] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:56.643 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 961555 00:10:56.643 [2024-06-10 10:06:18.473756] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:56.905 10:06:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.l4o3t2Bj2h 00:10:56.905 10:06:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:56.905 10:06:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:56.905 10:06:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:10:56.905 10:06:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:10:56.905 10:06:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:56.905 10:06:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:56.905 10:06:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:10:56.905 00:10:56.905 real 0m5.266s 00:10:56.905 user 0m8.226s 00:10:56.905 sys 0m0.758s 00:10:56.905 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:56.905 10:06:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:56.905 ************************************ 00:10:56.905 END TEST raid_read_error_test 00:10:56.905 ************************************ 00:10:56.905 10:06:18 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:10:56.905 10:06:18 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:56.905 10:06:18 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:56.905 10:06:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:56.905 ************************************ 00:10:56.905 START TEST raid_write_error_test 00:10:56.905 ************************************ 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 2 write 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:56.905 10:06:18 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.3WzIdmrbBj 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=962570 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 962570 /var/tmp/spdk-raid.sock 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 962570 ']' 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:56.905 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
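The write-error run recorded next is driven entirely over the /var/tmp/spdk-raid.sock RPC socket of the bdevperf process launched above. As a readable summary, the condensed sketch below is reconstructed from the xtrace entries that follow in this log; every path, bdev name and flag is copied from the trace, while the RPC= shorthand variable is introduced here only for brevity and does not appear in bdev_raid.sh.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Build an error-injectable stack under each base bdev: malloc -> error -> passthru.
for i in 1 2; do
    $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
    $RPC bdev_error_create BaseBdev${i}_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
done
# Assemble the raid0 volume: 64 KiB strip (-z 64), superblock enabled (-s).
$RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
# Inject write failures on the first base bdev, then kick off the queued bdevperf job
# (bdevperf was started with -z, so it waits for this perform_tests call).
$RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
# The volume must stay online; because raid0 has no redundancy, the bdevperf log is
# expected to report a non-zero fail-per-second rate for raid_bdev1.
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
$RPC bdev_raid_delete raid_bdev1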
00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:56.905 10:06:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:56.905 [2024-06-10 10:06:18.747574] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:10:56.905 [2024-06-10 10:06:18.747630] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid962570 ] 00:10:57.166 [2024-06-10 10:06:18.839870] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:57.166 [2024-06-10 10:06:18.908073] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:57.166 [2024-06-10 10:06:18.949792] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:57.166 [2024-06-10 10:06:18.949815] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:57.738 10:06:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:57.738 10:06:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:10:57.738 10:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:57.738 10:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:57.998 BaseBdev1_malloc 00:10:57.998 10:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:58.258 true 00:10:58.258 10:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:58.518 [2024-06-10 10:06:20.128133] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:58.518 [2024-06-10 10:06:20.128171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:58.518 [2024-06-10 10:06:20.128183] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2097d10 00:10:58.518 [2024-06-10 10:06:20.128189] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:58.518 [2024-06-10 10:06:20.129582] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:58.518 [2024-06-10 10:06:20.129607] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:58.518 BaseBdev1 00:10:58.518 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:58.518 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:58.518 BaseBdev2_malloc 00:10:58.518 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:58.778 true 00:10:58.778 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:59.039 [2024-06-10 10:06:20.679220] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:59.039 [2024-06-10 10:06:20.679253] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:59.039 [2024-06-10 10:06:20.679266] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x209c710 00:10:59.039 [2024-06-10 10:06:20.679272] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:59.039 [2024-06-10 10:06:20.680446] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:59.039 [2024-06-10 10:06:20.680466] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:59.039 BaseBdev2 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:59.039 [2024-06-10 10:06:20.871727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:59.039 [2024-06-10 10:06:20.872710] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:59.039 [2024-06-10 10:06:20.872852] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x209d9f0 00:10:59.039 [2024-06-10 10:06:20.872860] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:59.039 [2024-06-10 10:06:20.872999] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x209dcd0 00:10:59.039 [2024-06-10 10:06:20.873106] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x209d9f0 00:10:59.039 [2024-06-10 10:06:20.873112] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x209d9f0 00:10:59.039 [2024-06-10 10:06:20.873185] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.039 10:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:59.300 10:06:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:59.300 "name": "raid_bdev1", 00:10:59.300 "uuid": "f9c8595b-f9a3-4a74-81a0-fbe176cbe8e9", 00:10:59.300 "strip_size_kb": 64, 00:10:59.300 "state": "online", 00:10:59.300 "raid_level": "raid0", 00:10:59.300 "superblock": true, 00:10:59.300 "num_base_bdevs": 2, 00:10:59.300 "num_base_bdevs_discovered": 2, 00:10:59.300 "num_base_bdevs_operational": 2, 00:10:59.300 "base_bdevs_list": [ 00:10:59.300 { 00:10:59.300 "name": "BaseBdev1", 00:10:59.300 "uuid": "631b48c4-175a-53ac-ba29-2f3c35115656", 00:10:59.300 "is_configured": true, 00:10:59.300 "data_offset": 2048, 00:10:59.300 "data_size": 63488 00:10:59.300 }, 00:10:59.300 { 00:10:59.300 "name": "BaseBdev2", 00:10:59.300 "uuid": "17a37a17-89c3-5aa1-bf54-c0f127fc34d5", 00:10:59.300 "is_configured": true, 00:10:59.300 "data_offset": 2048, 00:10:59.300 "data_size": 63488 00:10:59.300 } 00:10:59.300 ] 00:10:59.300 }' 00:10:59.300 10:06:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:59.300 10:06:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:59.953 10:06:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:59.953 10:06:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:59.953 [2024-06-10 10:06:21.685962] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eed7e0 00:11:00.895 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.156 10:06:22 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:01.156 "name": "raid_bdev1", 00:11:01.156 "uuid": "f9c8595b-f9a3-4a74-81a0-fbe176cbe8e9", 00:11:01.156 "strip_size_kb": 64, 00:11:01.156 "state": "online", 00:11:01.156 "raid_level": "raid0", 00:11:01.156 "superblock": true, 00:11:01.156 "num_base_bdevs": 2, 00:11:01.156 "num_base_bdevs_discovered": 2, 00:11:01.156 "num_base_bdevs_operational": 2, 00:11:01.156 "base_bdevs_list": [ 00:11:01.156 { 00:11:01.156 "name": "BaseBdev1", 00:11:01.156 "uuid": "631b48c4-175a-53ac-ba29-2f3c35115656", 00:11:01.156 "is_configured": true, 00:11:01.156 "data_offset": 2048, 00:11:01.156 "data_size": 63488 00:11:01.156 }, 00:11:01.156 { 00:11:01.156 "name": "BaseBdev2", 00:11:01.156 "uuid": "17a37a17-89c3-5aa1-bf54-c0f127fc34d5", 00:11:01.156 "is_configured": true, 00:11:01.156 "data_offset": 2048, 00:11:01.156 "data_size": 63488 00:11:01.156 } 00:11:01.156 ] 00:11:01.156 }' 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:01.156 10:06:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:01.727 10:06:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:01.988 [2024-06-10 10:06:23.698176] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:01.988 [2024-06-10 10:06:23.698210] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:01.988 [2024-06-10 10:06:23.700797] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:01.988 [2024-06-10 10:06:23.700820] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:01.988 [2024-06-10 10:06:23.700843] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:01.988 [2024-06-10 10:06:23.700849] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x209d9f0 name raid_bdev1, state offline 00:11:01.988 0 00:11:01.988 10:06:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 962570 00:11:01.988 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 962570 ']' 00:11:01.988 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 962570 00:11:01.988 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:11:01.988 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:01.988 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 962570 00:11:01.988 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:01.988 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:01.988 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 962570' 00:11:01.988 killing process with pid 962570 00:11:01.988 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 962570 00:11:01.988 [2024-06-10 10:06:23.774685] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:11:01.988 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 962570 00:11:01.988 [2024-06-10 10:06:23.780609] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:02.250 10:06:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.3WzIdmrbBj 00:11:02.250 10:06:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:02.250 10:06:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:02.250 10:06:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:11:02.250 10:06:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:02.250 10:06:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:02.250 10:06:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:02.250 10:06:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:11:02.250 00:11:02.250 real 0m5.231s 00:11:02.250 user 0m8.190s 00:11:02.250 sys 0m0.743s 00:11:02.250 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:02.250 10:06:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:02.250 ************************************ 00:11:02.250 END TEST raid_write_error_test 00:11:02.250 ************************************ 00:11:02.250 10:06:23 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:02.250 10:06:23 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:11:02.250 10:06:23 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:02.250 10:06:23 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:02.250 10:06:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:02.250 ************************************ 00:11:02.250 START TEST raid_state_function_test 00:11:02.250 ************************************ 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 2 false 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:02.250 10:06:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=963570 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 963570' 00:11:02.250 Process raid pid: 963570 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 963570 /var/tmp/spdk-raid.sock 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 963570 ']' 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:02.250 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:02.250 10:06:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:02.250 [2024-06-10 10:06:24.042453] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
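The state-function entries that follow exercise the raid state machine for a two-disk concat volume through the bdev_svc RPC socket started above. The sketch below condenses that sequence (the script actually deletes and re-creates Existed_Raid between the individual checks); all commands and names are copied from the trace, and the RPC= variable is again only local shorthand for the rpc.py invocation used throughout the log.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Creating the raid before its base bdevs exist leaves it in the "configuring" state.
$RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'   # "state": "configuring"
# Each base bdev is claimed by the raid as soon as it is created; the volume only
# goes online once the last missing base bdev appears.
$RPC bdev_malloc_create 32 512 -b BaseBdev1
$RPC bdev_malloc_create 32 512 -b BaseBdev2
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'   # "state": "online"
# Removing a base bdev takes the volume offline again, since concat has no redundancy.
$RPC bdev_malloc_delete BaseBdev1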
00:11:02.250 [2024-06-10 10:06:24.042496] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:02.511 [2024-06-10 10:06:24.130179] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:02.511 [2024-06-10 10:06:24.192096] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:02.511 [2024-06-10 10:06:24.234426] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:02.511 [2024-06-10 10:06:24.234450] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:03.083 10:06:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:03.083 10:06:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:11:03.083 10:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:03.344 [2024-06-10 10:06:25.049274] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:03.344 [2024-06-10 10:06:25.049306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:03.344 [2024-06-10 10:06:25.049313] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:03.344 [2024-06-10 10:06:25.049319] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.344 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:03.605 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:03.605 "name": "Existed_Raid", 00:11:03.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.605 "strip_size_kb": 64, 00:11:03.605 "state": "configuring", 00:11:03.605 "raid_level": "concat", 00:11:03.605 "superblock": false, 00:11:03.605 "num_base_bdevs": 
2, 00:11:03.605 "num_base_bdevs_discovered": 0, 00:11:03.605 "num_base_bdevs_operational": 2, 00:11:03.605 "base_bdevs_list": [ 00:11:03.605 { 00:11:03.605 "name": "BaseBdev1", 00:11:03.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.605 "is_configured": false, 00:11:03.605 "data_offset": 0, 00:11:03.605 "data_size": 0 00:11:03.605 }, 00:11:03.605 { 00:11:03.605 "name": "BaseBdev2", 00:11:03.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.605 "is_configured": false, 00:11:03.605 "data_offset": 0, 00:11:03.605 "data_size": 0 00:11:03.605 } 00:11:03.605 ] 00:11:03.605 }' 00:11:03.605 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:03.605 10:06:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.177 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:04.177 [2024-06-10 10:06:25.955462] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:04.177 [2024-06-10 10:06:25.955476] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1865b00 name Existed_Raid, state configuring 00:11:04.177 10:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:04.438 [2024-06-10 10:06:26.147961] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:04.438 [2024-06-10 10:06:26.147978] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:04.438 [2024-06-10 10:06:26.147982] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:04.438 [2024-06-10 10:06:26.147988] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:04.438 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:04.699 [2024-06-10 10:06:26.346801] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:04.699 BaseBdev1 00:11:04.699 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:04.699 10:06:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:11:04.699 10:06:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:04.699 10:06:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:04.699 10:06:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:04.699 10:06:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:04.699 10:06:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:04.699 10:06:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:04.960 [ 00:11:04.960 { 
00:11:04.960 "name": "BaseBdev1", 00:11:04.960 "aliases": [ 00:11:04.960 "91c904d0-d1a6-4acc-b511-6fecef1aa926" 00:11:04.960 ], 00:11:04.960 "product_name": "Malloc disk", 00:11:04.960 "block_size": 512, 00:11:04.960 "num_blocks": 65536, 00:11:04.960 "uuid": "91c904d0-d1a6-4acc-b511-6fecef1aa926", 00:11:04.961 "assigned_rate_limits": { 00:11:04.961 "rw_ios_per_sec": 0, 00:11:04.961 "rw_mbytes_per_sec": 0, 00:11:04.961 "r_mbytes_per_sec": 0, 00:11:04.961 "w_mbytes_per_sec": 0 00:11:04.961 }, 00:11:04.961 "claimed": true, 00:11:04.961 "claim_type": "exclusive_write", 00:11:04.961 "zoned": false, 00:11:04.961 "supported_io_types": { 00:11:04.961 "read": true, 00:11:04.961 "write": true, 00:11:04.961 "unmap": true, 00:11:04.961 "write_zeroes": true, 00:11:04.961 "flush": true, 00:11:04.961 "reset": true, 00:11:04.961 "compare": false, 00:11:04.961 "compare_and_write": false, 00:11:04.961 "abort": true, 00:11:04.961 "nvme_admin": false, 00:11:04.961 "nvme_io": false 00:11:04.961 }, 00:11:04.961 "memory_domains": [ 00:11:04.961 { 00:11:04.961 "dma_device_id": "system", 00:11:04.961 "dma_device_type": 1 00:11:04.961 }, 00:11:04.961 { 00:11:04.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:04.961 "dma_device_type": 2 00:11:04.961 } 00:11:04.961 ], 00:11:04.961 "driver_specific": {} 00:11:04.961 } 00:11:04.961 ] 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.961 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:05.222 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:05.222 "name": "Existed_Raid", 00:11:05.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:05.222 "strip_size_kb": 64, 00:11:05.222 "state": "configuring", 00:11:05.222 "raid_level": "concat", 00:11:05.222 "superblock": false, 00:11:05.222 "num_base_bdevs": 2, 00:11:05.222 "num_base_bdevs_discovered": 1, 00:11:05.222 "num_base_bdevs_operational": 2, 00:11:05.222 "base_bdevs_list": [ 00:11:05.222 { 00:11:05.222 "name": "BaseBdev1", 00:11:05.222 "uuid": "91c904d0-d1a6-4acc-b511-6fecef1aa926", 00:11:05.222 
"is_configured": true, 00:11:05.222 "data_offset": 0, 00:11:05.222 "data_size": 65536 00:11:05.222 }, 00:11:05.222 { 00:11:05.222 "name": "BaseBdev2", 00:11:05.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:05.222 "is_configured": false, 00:11:05.222 "data_offset": 0, 00:11:05.222 "data_size": 0 00:11:05.222 } 00:11:05.222 ] 00:11:05.222 }' 00:11:05.222 10:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:05.222 10:06:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:05.793 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:05.793 [2024-06-10 10:06:27.654087] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:05.793 [2024-06-10 10:06:27.654114] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18653f0 name Existed_Raid, state configuring 00:11:06.053 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:06.053 [2024-06-10 10:06:27.846602] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:06.053 [2024-06-10 10:06:27.847738] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:06.053 [2024-06-10 10:06:27.847761] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:06.053 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:06.053 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:06.053 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:06.053 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:06.053 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:06.053 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:06.053 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:06.053 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:06.053 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:06.053 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:06.054 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:06.054 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:06.054 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:06.054 10:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:06.315 10:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:06.315 "name": "Existed_Raid", 00:11:06.315 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:11:06.315 "strip_size_kb": 64, 00:11:06.315 "state": "configuring", 00:11:06.315 "raid_level": "concat", 00:11:06.315 "superblock": false, 00:11:06.315 "num_base_bdevs": 2, 00:11:06.315 "num_base_bdevs_discovered": 1, 00:11:06.315 "num_base_bdevs_operational": 2, 00:11:06.315 "base_bdevs_list": [ 00:11:06.315 { 00:11:06.315 "name": "BaseBdev1", 00:11:06.315 "uuid": "91c904d0-d1a6-4acc-b511-6fecef1aa926", 00:11:06.315 "is_configured": true, 00:11:06.315 "data_offset": 0, 00:11:06.315 "data_size": 65536 00:11:06.315 }, 00:11:06.315 { 00:11:06.315 "name": "BaseBdev2", 00:11:06.315 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:06.315 "is_configured": false, 00:11:06.315 "data_offset": 0, 00:11:06.315 "data_size": 0 00:11:06.315 } 00:11:06.315 ] 00:11:06.315 }' 00:11:06.315 10:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:06.315 10:06:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:06.914 10:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:07.176 [2024-06-10 10:06:28.785765] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:07.176 [2024-06-10 10:06:28.785789] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18661c0 00:11:07.176 [2024-06-10 10:06:28.785793] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:07.176 [2024-06-10 10:06:28.785946] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a19220 00:11:07.176 [2024-06-10 10:06:28.786037] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18661c0 00:11:07.176 [2024-06-10 10:06:28.786043] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x18661c0 00:11:07.176 [2024-06-10 10:06:28.786167] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:07.176 BaseBdev2 00:11:07.176 10:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:07.176 10:06:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:11:07.177 10:06:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:07.177 10:06:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:07.177 10:06:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:07.177 10:06:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:07.177 10:06:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:07.177 10:06:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:07.438 [ 00:11:07.438 { 00:11:07.438 "name": "BaseBdev2", 00:11:07.438 "aliases": [ 00:11:07.438 "388607e7-7a2b-4e5b-bf8b-4826003cdff9" 00:11:07.438 ], 00:11:07.438 "product_name": "Malloc disk", 00:11:07.438 "block_size": 512, 00:11:07.438 "num_blocks": 65536, 00:11:07.438 "uuid": 
"388607e7-7a2b-4e5b-bf8b-4826003cdff9", 00:11:07.438 "assigned_rate_limits": { 00:11:07.438 "rw_ios_per_sec": 0, 00:11:07.438 "rw_mbytes_per_sec": 0, 00:11:07.438 "r_mbytes_per_sec": 0, 00:11:07.438 "w_mbytes_per_sec": 0 00:11:07.438 }, 00:11:07.438 "claimed": true, 00:11:07.438 "claim_type": "exclusive_write", 00:11:07.438 "zoned": false, 00:11:07.438 "supported_io_types": { 00:11:07.438 "read": true, 00:11:07.438 "write": true, 00:11:07.438 "unmap": true, 00:11:07.438 "write_zeroes": true, 00:11:07.438 "flush": true, 00:11:07.438 "reset": true, 00:11:07.438 "compare": false, 00:11:07.438 "compare_and_write": false, 00:11:07.438 "abort": true, 00:11:07.438 "nvme_admin": false, 00:11:07.438 "nvme_io": false 00:11:07.438 }, 00:11:07.438 "memory_domains": [ 00:11:07.438 { 00:11:07.438 "dma_device_id": "system", 00:11:07.438 "dma_device_type": 1 00:11:07.438 }, 00:11:07.438 { 00:11:07.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:07.438 "dma_device_type": 2 00:11:07.438 } 00:11:07.438 ], 00:11:07.438 "driver_specific": {} 00:11:07.438 } 00:11:07.438 ] 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.438 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:07.698 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:07.698 "name": "Existed_Raid", 00:11:07.698 "uuid": "24b50b4b-014b-40cf-b5e1-fc64458d0f68", 00:11:07.698 "strip_size_kb": 64, 00:11:07.698 "state": "online", 00:11:07.698 "raid_level": "concat", 00:11:07.698 "superblock": false, 00:11:07.698 "num_base_bdevs": 2, 00:11:07.698 "num_base_bdevs_discovered": 2, 00:11:07.698 "num_base_bdevs_operational": 2, 00:11:07.698 "base_bdevs_list": [ 00:11:07.698 { 00:11:07.698 "name": "BaseBdev1", 00:11:07.698 "uuid": "91c904d0-d1a6-4acc-b511-6fecef1aa926", 00:11:07.698 "is_configured": true, 00:11:07.698 "data_offset": 0, 
00:11:07.698 "data_size": 65536 00:11:07.698 }, 00:11:07.698 { 00:11:07.698 "name": "BaseBdev2", 00:11:07.698 "uuid": "388607e7-7a2b-4e5b-bf8b-4826003cdff9", 00:11:07.698 "is_configured": true, 00:11:07.698 "data_offset": 0, 00:11:07.698 "data_size": 65536 00:11:07.698 } 00:11:07.698 ] 00:11:07.698 }' 00:11:07.698 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:07.698 10:06:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.268 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:08.268 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:08.268 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:08.268 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:08.268 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:08.268 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:08.268 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:08.268 10:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:08.268 [2024-06-10 10:06:30.033115] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:08.268 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:08.268 "name": "Existed_Raid", 00:11:08.268 "aliases": [ 00:11:08.268 "24b50b4b-014b-40cf-b5e1-fc64458d0f68" 00:11:08.268 ], 00:11:08.268 "product_name": "Raid Volume", 00:11:08.268 "block_size": 512, 00:11:08.268 "num_blocks": 131072, 00:11:08.268 "uuid": "24b50b4b-014b-40cf-b5e1-fc64458d0f68", 00:11:08.268 "assigned_rate_limits": { 00:11:08.268 "rw_ios_per_sec": 0, 00:11:08.268 "rw_mbytes_per_sec": 0, 00:11:08.268 "r_mbytes_per_sec": 0, 00:11:08.268 "w_mbytes_per_sec": 0 00:11:08.268 }, 00:11:08.268 "claimed": false, 00:11:08.268 "zoned": false, 00:11:08.268 "supported_io_types": { 00:11:08.268 "read": true, 00:11:08.268 "write": true, 00:11:08.268 "unmap": true, 00:11:08.268 "write_zeroes": true, 00:11:08.268 "flush": true, 00:11:08.268 "reset": true, 00:11:08.268 "compare": false, 00:11:08.268 "compare_and_write": false, 00:11:08.268 "abort": false, 00:11:08.268 "nvme_admin": false, 00:11:08.268 "nvme_io": false 00:11:08.268 }, 00:11:08.268 "memory_domains": [ 00:11:08.268 { 00:11:08.268 "dma_device_id": "system", 00:11:08.268 "dma_device_type": 1 00:11:08.268 }, 00:11:08.268 { 00:11:08.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:08.268 "dma_device_type": 2 00:11:08.268 }, 00:11:08.268 { 00:11:08.268 "dma_device_id": "system", 00:11:08.268 "dma_device_type": 1 00:11:08.268 }, 00:11:08.268 { 00:11:08.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:08.268 "dma_device_type": 2 00:11:08.268 } 00:11:08.268 ], 00:11:08.268 "driver_specific": { 00:11:08.268 "raid": { 00:11:08.268 "uuid": "24b50b4b-014b-40cf-b5e1-fc64458d0f68", 00:11:08.268 "strip_size_kb": 64, 00:11:08.268 "state": "online", 00:11:08.268 "raid_level": "concat", 00:11:08.268 "superblock": false, 00:11:08.268 "num_base_bdevs": 2, 00:11:08.268 "num_base_bdevs_discovered": 2, 00:11:08.268 "num_base_bdevs_operational": 2, 00:11:08.268 
"base_bdevs_list": [ 00:11:08.268 { 00:11:08.268 "name": "BaseBdev1", 00:11:08.268 "uuid": "91c904d0-d1a6-4acc-b511-6fecef1aa926", 00:11:08.268 "is_configured": true, 00:11:08.268 "data_offset": 0, 00:11:08.268 "data_size": 65536 00:11:08.268 }, 00:11:08.268 { 00:11:08.268 "name": "BaseBdev2", 00:11:08.268 "uuid": "388607e7-7a2b-4e5b-bf8b-4826003cdff9", 00:11:08.268 "is_configured": true, 00:11:08.268 "data_offset": 0, 00:11:08.268 "data_size": 65536 00:11:08.268 } 00:11:08.268 ] 00:11:08.268 } 00:11:08.268 } 00:11:08.268 }' 00:11:08.268 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:08.268 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:08.268 BaseBdev2' 00:11:08.268 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:08.268 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:08.268 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:08.528 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:08.528 "name": "BaseBdev1", 00:11:08.528 "aliases": [ 00:11:08.528 "91c904d0-d1a6-4acc-b511-6fecef1aa926" 00:11:08.528 ], 00:11:08.528 "product_name": "Malloc disk", 00:11:08.528 "block_size": 512, 00:11:08.528 "num_blocks": 65536, 00:11:08.528 "uuid": "91c904d0-d1a6-4acc-b511-6fecef1aa926", 00:11:08.528 "assigned_rate_limits": { 00:11:08.528 "rw_ios_per_sec": 0, 00:11:08.528 "rw_mbytes_per_sec": 0, 00:11:08.528 "r_mbytes_per_sec": 0, 00:11:08.528 "w_mbytes_per_sec": 0 00:11:08.528 }, 00:11:08.528 "claimed": true, 00:11:08.528 "claim_type": "exclusive_write", 00:11:08.528 "zoned": false, 00:11:08.528 "supported_io_types": { 00:11:08.528 "read": true, 00:11:08.528 "write": true, 00:11:08.528 "unmap": true, 00:11:08.528 "write_zeroes": true, 00:11:08.528 "flush": true, 00:11:08.528 "reset": true, 00:11:08.528 "compare": false, 00:11:08.528 "compare_and_write": false, 00:11:08.528 "abort": true, 00:11:08.528 "nvme_admin": false, 00:11:08.528 "nvme_io": false 00:11:08.528 }, 00:11:08.528 "memory_domains": [ 00:11:08.528 { 00:11:08.528 "dma_device_id": "system", 00:11:08.528 "dma_device_type": 1 00:11:08.528 }, 00:11:08.528 { 00:11:08.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:08.528 "dma_device_type": 2 00:11:08.528 } 00:11:08.528 ], 00:11:08.528 "driver_specific": {} 00:11:08.528 }' 00:11:08.528 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:08.528 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:08.528 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:08.528 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:08.528 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:08.528 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:08.528 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:08.788 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:08.788 10:06:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:08.788 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:08.788 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:08.788 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:08.788 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:08.788 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:08.788 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:09.048 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:09.048 "name": "BaseBdev2", 00:11:09.048 "aliases": [ 00:11:09.048 "388607e7-7a2b-4e5b-bf8b-4826003cdff9" 00:11:09.048 ], 00:11:09.048 "product_name": "Malloc disk", 00:11:09.048 "block_size": 512, 00:11:09.048 "num_blocks": 65536, 00:11:09.048 "uuid": "388607e7-7a2b-4e5b-bf8b-4826003cdff9", 00:11:09.048 "assigned_rate_limits": { 00:11:09.048 "rw_ios_per_sec": 0, 00:11:09.048 "rw_mbytes_per_sec": 0, 00:11:09.048 "r_mbytes_per_sec": 0, 00:11:09.048 "w_mbytes_per_sec": 0 00:11:09.048 }, 00:11:09.048 "claimed": true, 00:11:09.048 "claim_type": "exclusive_write", 00:11:09.048 "zoned": false, 00:11:09.048 "supported_io_types": { 00:11:09.048 "read": true, 00:11:09.048 "write": true, 00:11:09.048 "unmap": true, 00:11:09.048 "write_zeroes": true, 00:11:09.048 "flush": true, 00:11:09.048 "reset": true, 00:11:09.048 "compare": false, 00:11:09.048 "compare_and_write": false, 00:11:09.048 "abort": true, 00:11:09.048 "nvme_admin": false, 00:11:09.048 "nvme_io": false 00:11:09.048 }, 00:11:09.048 "memory_domains": [ 00:11:09.048 { 00:11:09.048 "dma_device_id": "system", 00:11:09.048 "dma_device_type": 1 00:11:09.048 }, 00:11:09.048 { 00:11:09.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:09.048 "dma_device_type": 2 00:11:09.048 } 00:11:09.048 ], 00:11:09.048 "driver_specific": {} 00:11:09.048 }' 00:11:09.048 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:09.048 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:09.048 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:09.048 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:09.048 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:09.048 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:09.048 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:09.308 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:09.308 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:09.308 10:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:09.308 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:09.308 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:09.308 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:09.568 [2024-06-10 10:06:31.223977] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:09.568 [2024-06-10 10:06:31.223996] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:09.568 [2024-06-10 10:06:31.224027] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:09.568 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:09.568 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:09.568 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:09.568 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:09.568 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:09.568 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:09.569 "name": "Existed_Raid", 00:11:09.569 "uuid": "24b50b4b-014b-40cf-b5e1-fc64458d0f68", 00:11:09.569 "strip_size_kb": 64, 00:11:09.569 "state": "offline", 00:11:09.569 "raid_level": "concat", 00:11:09.569 "superblock": false, 00:11:09.569 "num_base_bdevs": 2, 00:11:09.569 "num_base_bdevs_discovered": 1, 00:11:09.569 "num_base_bdevs_operational": 1, 00:11:09.569 "base_bdevs_list": [ 00:11:09.569 { 00:11:09.569 "name": null, 00:11:09.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.569 "is_configured": false, 00:11:09.569 "data_offset": 0, 00:11:09.569 "data_size": 65536 00:11:09.569 }, 00:11:09.569 { 00:11:09.569 "name": "BaseBdev2", 00:11:09.569 "uuid": "388607e7-7a2b-4e5b-bf8b-4826003cdff9", 00:11:09.569 "is_configured": true, 00:11:09.569 "data_offset": 0, 00:11:09.569 "data_size": 65536 00:11:09.569 } 00:11:09.569 ] 00:11:09.569 }' 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:11:09.569 10:06:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:10.139 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:10.139 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:10.139 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:10.139 10:06:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.397 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:10.397 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:10.397 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:10.656 [2024-06-10 10:06:32.286670] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:10.656 [2024-06-10 10:06:32.286706] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18661c0 name Existed_Raid, state offline 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 963570 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 963570 ']' 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 963570 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:10.656 10:06:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 963570 00:11:10.916 10:06:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:10.916 10:06:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:10.916 10:06:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 963570' 00:11:10.916 killing process with pid 963570 00:11:10.916 10:06:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 963570 00:11:10.917 [2024-06-10 10:06:32.548706] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@973 -- # wait 963570 00:11:10.917 [2024-06-10 10:06:32.549316] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:10.917 00:11:10.917 real 0m8.685s 00:11:10.917 user 0m15.797s 00:11:10.917 sys 0m1.275s 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:10.917 ************************************ 00:11:10.917 END TEST raid_state_function_test 00:11:10.917 ************************************ 00:11:10.917 10:06:32 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:11:10.917 10:06:32 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:10.917 10:06:32 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:10.917 10:06:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:10.917 ************************************ 00:11:10.917 START TEST raid_state_function_test_sb 00:11:10.917 ************************************ 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 2 true 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=965326 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 965326' 00:11:10.917 Process raid pid: 965326 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 965326 /var/tmp/spdk-raid.sock 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 965326 ']' 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:10.917 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:10.917 10:06:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:11.176 [2024-06-10 10:06:32.805528] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
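For reference, the RPC-driven flow that produces the traces in this test can be reproduced by hand roughly as follows. This is a minimal sketch assembled only from commands already visible in the trace above (bdev_svc, rpc.py, jq); relative paths assume the SPDK source tree, the bdev names are the ones the test itself uses, and the ordering is illustrative rather than the exact sequence the state-function test steps through.

# Start the minimal bdev application with a dedicated RPC socket and raid debug logging,
# as the test does with /var/tmp/spdk-raid.sock.
test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &

# Create two malloc base bdevs (32 MiB, 512-byte blocks -> 65536 blocks each) and
# assemble them into a concat raid with a superblock (-s) and a 64 KiB strip size (-z 64).
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat \
    -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

# Inspect the resulting raid bdev the same way verify_raid_bdev_state does.
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid")'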
00:11:11.176 [2024-06-10 10:06:32.805573] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:11.176 [2024-06-10 10:06:32.894577] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:11.176 [2024-06-10 10:06:32.958240] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:11.176 [2024-06-10 10:06:33.002977] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:11.176 [2024-06-10 10:06:33.002999] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:12.118 [2024-06-10 10:06:33.810372] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:12.118 [2024-06-10 10:06:33.810401] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:12.118 [2024-06-10 10:06:33.810407] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:12.118 [2024-06-10 10:06:33.810413] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.118 10:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:12.377 10:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:12.377 "name": "Existed_Raid", 00:11:12.377 "uuid": "d53385fb-5ca6-43ea-9bf4-b6891753c960", 00:11:12.377 "strip_size_kb": 64, 00:11:12.377 "state": "configuring", 00:11:12.377 "raid_level": "concat", 00:11:12.377 
"superblock": true, 00:11:12.377 "num_base_bdevs": 2, 00:11:12.377 "num_base_bdevs_discovered": 0, 00:11:12.377 "num_base_bdevs_operational": 2, 00:11:12.377 "base_bdevs_list": [ 00:11:12.377 { 00:11:12.377 "name": "BaseBdev1", 00:11:12.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:12.377 "is_configured": false, 00:11:12.377 "data_offset": 0, 00:11:12.377 "data_size": 0 00:11:12.377 }, 00:11:12.377 { 00:11:12.377 "name": "BaseBdev2", 00:11:12.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:12.377 "is_configured": false, 00:11:12.377 "data_offset": 0, 00:11:12.377 "data_size": 0 00:11:12.377 } 00:11:12.377 ] 00:11:12.377 }' 00:11:12.377 10:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:12.377 10:06:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:12.945 10:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:12.945 [2024-06-10 10:06:34.700505] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:12.945 [2024-06-10 10:06:34.700522] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2127b00 name Existed_Raid, state configuring 00:11:12.945 10:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:13.206 [2024-06-10 10:06:34.889007] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:13.206 [2024-06-10 10:06:34.889024] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:13.206 [2024-06-10 10:06:34.889029] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:13.206 [2024-06-10 10:06:34.889035] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:13.206 10:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:13.466 [2024-06-10 10:06:35.084197] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:13.466 BaseBdev1 00:11:13.466 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:13.466 10:06:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:11:13.466 10:06:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:13.466 10:06:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:13.466 10:06:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:13.466 10:06:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:13.466 10:06:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:13.466 10:06:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:13.726 [ 00:11:13.726 { 00:11:13.726 "name": "BaseBdev1", 00:11:13.726 "aliases": [ 00:11:13.726 "c408e83b-68c7-443a-a12e-a4e890033266" 00:11:13.726 ], 00:11:13.726 "product_name": "Malloc disk", 00:11:13.726 "block_size": 512, 00:11:13.726 "num_blocks": 65536, 00:11:13.726 "uuid": "c408e83b-68c7-443a-a12e-a4e890033266", 00:11:13.726 "assigned_rate_limits": { 00:11:13.726 "rw_ios_per_sec": 0, 00:11:13.726 "rw_mbytes_per_sec": 0, 00:11:13.726 "r_mbytes_per_sec": 0, 00:11:13.726 "w_mbytes_per_sec": 0 00:11:13.726 }, 00:11:13.726 "claimed": true, 00:11:13.726 "claim_type": "exclusive_write", 00:11:13.726 "zoned": false, 00:11:13.726 "supported_io_types": { 00:11:13.726 "read": true, 00:11:13.726 "write": true, 00:11:13.726 "unmap": true, 00:11:13.726 "write_zeroes": true, 00:11:13.726 "flush": true, 00:11:13.726 "reset": true, 00:11:13.726 "compare": false, 00:11:13.726 "compare_and_write": false, 00:11:13.726 "abort": true, 00:11:13.726 "nvme_admin": false, 00:11:13.726 "nvme_io": false 00:11:13.726 }, 00:11:13.726 "memory_domains": [ 00:11:13.726 { 00:11:13.726 "dma_device_id": "system", 00:11:13.726 "dma_device_type": 1 00:11:13.726 }, 00:11:13.726 { 00:11:13.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.726 "dma_device_type": 2 00:11:13.726 } 00:11:13.726 ], 00:11:13.726 "driver_specific": {} 00:11:13.726 } 00:11:13.726 ] 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:13.726 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.986 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:13.986 "name": "Existed_Raid", 00:11:13.986 "uuid": "5828f192-e9b0-4c2f-b74b-92477deef75a", 00:11:13.986 "strip_size_kb": 64, 00:11:13.986 "state": "configuring", 00:11:13.986 "raid_level": "concat", 00:11:13.986 "superblock": true, 00:11:13.986 "num_base_bdevs": 2, 00:11:13.986 "num_base_bdevs_discovered": 1, 00:11:13.986 "num_base_bdevs_operational": 2, 00:11:13.986 
"base_bdevs_list": [ 00:11:13.986 { 00:11:13.986 "name": "BaseBdev1", 00:11:13.986 "uuid": "c408e83b-68c7-443a-a12e-a4e890033266", 00:11:13.986 "is_configured": true, 00:11:13.986 "data_offset": 2048, 00:11:13.986 "data_size": 63488 00:11:13.986 }, 00:11:13.986 { 00:11:13.986 "name": "BaseBdev2", 00:11:13.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:13.986 "is_configured": false, 00:11:13.986 "data_offset": 0, 00:11:13.986 "data_size": 0 00:11:13.986 } 00:11:13.986 ] 00:11:13.986 }' 00:11:13.986 10:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:13.986 10:06:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:14.556 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:14.556 [2024-06-10 10:06:36.391550] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:14.556 [2024-06-10 10:06:36.391574] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21273f0 name Existed_Raid, state configuring 00:11:14.556 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:14.818 [2024-06-10 10:06:36.572039] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:14.818 [2024-06-10 10:06:36.573167] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:14.818 [2024-06-10 10:06:36.573191] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.818 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:11:15.079 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:15.079 "name": "Existed_Raid", 00:11:15.079 "uuid": "857c0e31-9629-446f-a854-5aa7e9dbaff6", 00:11:15.079 "strip_size_kb": 64, 00:11:15.079 "state": "configuring", 00:11:15.079 "raid_level": "concat", 00:11:15.079 "superblock": true, 00:11:15.079 "num_base_bdevs": 2, 00:11:15.079 "num_base_bdevs_discovered": 1, 00:11:15.079 "num_base_bdevs_operational": 2, 00:11:15.079 "base_bdevs_list": [ 00:11:15.079 { 00:11:15.079 "name": "BaseBdev1", 00:11:15.079 "uuid": "c408e83b-68c7-443a-a12e-a4e890033266", 00:11:15.079 "is_configured": true, 00:11:15.079 "data_offset": 2048, 00:11:15.079 "data_size": 63488 00:11:15.079 }, 00:11:15.079 { 00:11:15.079 "name": "BaseBdev2", 00:11:15.079 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.079 "is_configured": false, 00:11:15.079 "data_offset": 0, 00:11:15.079 "data_size": 0 00:11:15.079 } 00:11:15.079 ] 00:11:15.079 }' 00:11:15.079 10:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:15.079 10:06:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:15.650 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:15.650 [2024-06-10 10:06:37.515411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:15.650 [2024-06-10 10:06:37.515515] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21281c0 00:11:15.650 [2024-06-10 10:06:37.515523] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:15.650 [2024-06-10 10:06:37.515663] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22db220 00:11:15.650 [2024-06-10 10:06:37.515750] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21281c0 00:11:15.910 [2024-06-10 10:06:37.515756] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21281c0 00:11:15.910 [2024-06-10 10:06:37.515833] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:15.910 BaseBdev2 00:11:15.910 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:15.910 10:06:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:11:15.910 10:06:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:15.910 10:06:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:15.910 10:06:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:15.910 10:06:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:15.910 10:06:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:15.910 10:06:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:16.170 [ 00:11:16.170 { 00:11:16.170 "name": "BaseBdev2", 00:11:16.170 
"aliases": [ 00:11:16.170 "7c7e94bf-ab26-450b-bbd7-afce6f6ab86b" 00:11:16.170 ], 00:11:16.170 "product_name": "Malloc disk", 00:11:16.170 "block_size": 512, 00:11:16.170 "num_blocks": 65536, 00:11:16.170 "uuid": "7c7e94bf-ab26-450b-bbd7-afce6f6ab86b", 00:11:16.170 "assigned_rate_limits": { 00:11:16.170 "rw_ios_per_sec": 0, 00:11:16.170 "rw_mbytes_per_sec": 0, 00:11:16.170 "r_mbytes_per_sec": 0, 00:11:16.170 "w_mbytes_per_sec": 0 00:11:16.170 }, 00:11:16.170 "claimed": true, 00:11:16.170 "claim_type": "exclusive_write", 00:11:16.170 "zoned": false, 00:11:16.170 "supported_io_types": { 00:11:16.170 "read": true, 00:11:16.170 "write": true, 00:11:16.170 "unmap": true, 00:11:16.170 "write_zeroes": true, 00:11:16.170 "flush": true, 00:11:16.170 "reset": true, 00:11:16.170 "compare": false, 00:11:16.170 "compare_and_write": false, 00:11:16.170 "abort": true, 00:11:16.170 "nvme_admin": false, 00:11:16.170 "nvme_io": false 00:11:16.170 }, 00:11:16.170 "memory_domains": [ 00:11:16.170 { 00:11:16.170 "dma_device_id": "system", 00:11:16.170 "dma_device_type": 1 00:11:16.170 }, 00:11:16.170 { 00:11:16.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:16.170 "dma_device_type": 2 00:11:16.171 } 00:11:16.171 ], 00:11:16.171 "driver_specific": {} 00:11:16.171 } 00:11:16.171 ] 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.171 10:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:16.431 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:16.431 "name": "Existed_Raid", 00:11:16.431 "uuid": "857c0e31-9629-446f-a854-5aa7e9dbaff6", 00:11:16.431 "strip_size_kb": 64, 00:11:16.431 "state": "online", 00:11:16.431 "raid_level": "concat", 00:11:16.431 "superblock": true, 00:11:16.431 "num_base_bdevs": 2, 00:11:16.431 "num_base_bdevs_discovered": 2, 
00:11:16.431 "num_base_bdevs_operational": 2, 00:11:16.431 "base_bdevs_list": [ 00:11:16.431 { 00:11:16.431 "name": "BaseBdev1", 00:11:16.431 "uuid": "c408e83b-68c7-443a-a12e-a4e890033266", 00:11:16.431 "is_configured": true, 00:11:16.431 "data_offset": 2048, 00:11:16.431 "data_size": 63488 00:11:16.431 }, 00:11:16.431 { 00:11:16.431 "name": "BaseBdev2", 00:11:16.431 "uuid": "7c7e94bf-ab26-450b-bbd7-afce6f6ab86b", 00:11:16.431 "is_configured": true, 00:11:16.431 "data_offset": 2048, 00:11:16.431 "data_size": 63488 00:11:16.431 } 00:11:16.431 ] 00:11:16.431 }' 00:11:16.431 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:16.431 10:06:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:17.001 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:17.001 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:17.001 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:17.001 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:17.001 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:17.001 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:17.001 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:17.001 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:17.001 [2024-06-10 10:06:38.838967] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:17.001 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:17.001 "name": "Existed_Raid", 00:11:17.001 "aliases": [ 00:11:17.001 "857c0e31-9629-446f-a854-5aa7e9dbaff6" 00:11:17.001 ], 00:11:17.001 "product_name": "Raid Volume", 00:11:17.001 "block_size": 512, 00:11:17.001 "num_blocks": 126976, 00:11:17.001 "uuid": "857c0e31-9629-446f-a854-5aa7e9dbaff6", 00:11:17.001 "assigned_rate_limits": { 00:11:17.001 "rw_ios_per_sec": 0, 00:11:17.001 "rw_mbytes_per_sec": 0, 00:11:17.001 "r_mbytes_per_sec": 0, 00:11:17.001 "w_mbytes_per_sec": 0 00:11:17.001 }, 00:11:17.001 "claimed": false, 00:11:17.001 "zoned": false, 00:11:17.001 "supported_io_types": { 00:11:17.001 "read": true, 00:11:17.001 "write": true, 00:11:17.001 "unmap": true, 00:11:17.001 "write_zeroes": true, 00:11:17.001 "flush": true, 00:11:17.001 "reset": true, 00:11:17.001 "compare": false, 00:11:17.001 "compare_and_write": false, 00:11:17.001 "abort": false, 00:11:17.001 "nvme_admin": false, 00:11:17.001 "nvme_io": false 00:11:17.001 }, 00:11:17.001 "memory_domains": [ 00:11:17.001 { 00:11:17.001 "dma_device_id": "system", 00:11:17.001 "dma_device_type": 1 00:11:17.001 }, 00:11:17.001 { 00:11:17.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.001 "dma_device_type": 2 00:11:17.001 }, 00:11:17.001 { 00:11:17.001 "dma_device_id": "system", 00:11:17.001 "dma_device_type": 1 00:11:17.001 }, 00:11:17.001 { 00:11:17.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.001 "dma_device_type": 2 00:11:17.001 } 00:11:17.001 ], 00:11:17.001 "driver_specific": { 00:11:17.001 "raid": { 00:11:17.001 "uuid": 
"857c0e31-9629-446f-a854-5aa7e9dbaff6", 00:11:17.001 "strip_size_kb": 64, 00:11:17.001 "state": "online", 00:11:17.001 "raid_level": "concat", 00:11:17.001 "superblock": true, 00:11:17.001 "num_base_bdevs": 2, 00:11:17.002 "num_base_bdevs_discovered": 2, 00:11:17.002 "num_base_bdevs_operational": 2, 00:11:17.002 "base_bdevs_list": [ 00:11:17.002 { 00:11:17.002 "name": "BaseBdev1", 00:11:17.002 "uuid": "c408e83b-68c7-443a-a12e-a4e890033266", 00:11:17.002 "is_configured": true, 00:11:17.002 "data_offset": 2048, 00:11:17.002 "data_size": 63488 00:11:17.002 }, 00:11:17.002 { 00:11:17.002 "name": "BaseBdev2", 00:11:17.002 "uuid": "7c7e94bf-ab26-450b-bbd7-afce6f6ab86b", 00:11:17.002 "is_configured": true, 00:11:17.002 "data_offset": 2048, 00:11:17.002 "data_size": 63488 00:11:17.002 } 00:11:17.002 ] 00:11:17.002 } 00:11:17.002 } 00:11:17.002 }' 00:11:17.002 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:17.262 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:17.262 BaseBdev2' 00:11:17.262 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:17.262 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:17.262 10:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:17.262 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:17.262 "name": "BaseBdev1", 00:11:17.262 "aliases": [ 00:11:17.262 "c408e83b-68c7-443a-a12e-a4e890033266" 00:11:17.262 ], 00:11:17.262 "product_name": "Malloc disk", 00:11:17.262 "block_size": 512, 00:11:17.262 "num_blocks": 65536, 00:11:17.262 "uuid": "c408e83b-68c7-443a-a12e-a4e890033266", 00:11:17.262 "assigned_rate_limits": { 00:11:17.262 "rw_ios_per_sec": 0, 00:11:17.262 "rw_mbytes_per_sec": 0, 00:11:17.262 "r_mbytes_per_sec": 0, 00:11:17.262 "w_mbytes_per_sec": 0 00:11:17.262 }, 00:11:17.262 "claimed": true, 00:11:17.262 "claim_type": "exclusive_write", 00:11:17.262 "zoned": false, 00:11:17.262 "supported_io_types": { 00:11:17.262 "read": true, 00:11:17.262 "write": true, 00:11:17.262 "unmap": true, 00:11:17.262 "write_zeroes": true, 00:11:17.262 "flush": true, 00:11:17.262 "reset": true, 00:11:17.262 "compare": false, 00:11:17.262 "compare_and_write": false, 00:11:17.262 "abort": true, 00:11:17.262 "nvme_admin": false, 00:11:17.262 "nvme_io": false 00:11:17.262 }, 00:11:17.262 "memory_domains": [ 00:11:17.262 { 00:11:17.262 "dma_device_id": "system", 00:11:17.262 "dma_device_type": 1 00:11:17.262 }, 00:11:17.262 { 00:11:17.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.262 "dma_device_type": 2 00:11:17.262 } 00:11:17.262 ], 00:11:17.262 "driver_specific": {} 00:11:17.262 }' 00:11:17.262 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:17.522 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:17.522 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:17.522 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:17.522 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:17.522 
10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:17.522 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:17.522 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:17.522 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:17.522 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:17.782 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:17.782 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:17.782 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:17.782 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:17.782 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:17.782 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:17.782 "name": "BaseBdev2", 00:11:17.782 "aliases": [ 00:11:17.782 "7c7e94bf-ab26-450b-bbd7-afce6f6ab86b" 00:11:17.782 ], 00:11:17.782 "product_name": "Malloc disk", 00:11:17.782 "block_size": 512, 00:11:17.782 "num_blocks": 65536, 00:11:17.782 "uuid": "7c7e94bf-ab26-450b-bbd7-afce6f6ab86b", 00:11:17.782 "assigned_rate_limits": { 00:11:17.782 "rw_ios_per_sec": 0, 00:11:17.782 "rw_mbytes_per_sec": 0, 00:11:17.782 "r_mbytes_per_sec": 0, 00:11:17.782 "w_mbytes_per_sec": 0 00:11:17.782 }, 00:11:17.782 "claimed": true, 00:11:17.782 "claim_type": "exclusive_write", 00:11:17.782 "zoned": false, 00:11:17.782 "supported_io_types": { 00:11:17.782 "read": true, 00:11:17.782 "write": true, 00:11:17.782 "unmap": true, 00:11:17.782 "write_zeroes": true, 00:11:17.782 "flush": true, 00:11:17.782 "reset": true, 00:11:17.782 "compare": false, 00:11:17.782 "compare_and_write": false, 00:11:17.782 "abort": true, 00:11:17.782 "nvme_admin": false, 00:11:17.782 "nvme_io": false 00:11:17.782 }, 00:11:17.782 "memory_domains": [ 00:11:17.782 { 00:11:17.782 "dma_device_id": "system", 00:11:17.782 "dma_device_type": 1 00:11:17.782 }, 00:11:17.782 { 00:11:17.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.782 "dma_device_type": 2 00:11:17.782 } 00:11:17.782 ], 00:11:17.782 "driver_specific": {} 00:11:17.782 }' 00:11:17.782 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:17.782 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:18.043 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:18.043 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:18.043 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:18.043 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:18.043 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:18.043 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:18.043 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:18.043 
10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:18.303 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:18.303 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:18.303 10:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:18.303 [2024-06-10 10:06:40.150190] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:18.303 [2024-06-10 10:06:40.150211] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:18.303 [2024-06-10 10:06:40.150242] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:18.303 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:18.563 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.563 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:18.563 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:18.563 "name": "Existed_Raid", 00:11:18.563 "uuid": "857c0e31-9629-446f-a854-5aa7e9dbaff6", 00:11:18.563 "strip_size_kb": 64, 00:11:18.563 "state": "offline", 00:11:18.563 "raid_level": "concat", 00:11:18.563 "superblock": true, 00:11:18.563 "num_base_bdevs": 2, 00:11:18.563 "num_base_bdevs_discovered": 1, 00:11:18.563 "num_base_bdevs_operational": 1, 00:11:18.563 "base_bdevs_list": [ 00:11:18.563 { 00:11:18.563 "name": null, 00:11:18.563 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.563 
"is_configured": false, 00:11:18.563 "data_offset": 2048, 00:11:18.563 "data_size": 63488 00:11:18.563 }, 00:11:18.563 { 00:11:18.563 "name": "BaseBdev2", 00:11:18.563 "uuid": "7c7e94bf-ab26-450b-bbd7-afce6f6ab86b", 00:11:18.563 "is_configured": true, 00:11:18.563 "data_offset": 2048, 00:11:18.563 "data_size": 63488 00:11:18.563 } 00:11:18.563 ] 00:11:18.563 }' 00:11:18.563 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:18.563 10:06:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:19.133 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:19.133 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:19.133 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:19.133 10:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:19.393 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:19.393 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:19.393 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:19.653 [2024-06-10 10:06:41.273013] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:19.653 [2024-06-10 10:06:41.273044] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21281c0 name Existed_Raid, state offline 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 965326 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 965326 ']' 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 965326 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:19.653 10:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 965326 00:11:19.914 10:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:19.914 10:06:41 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:19.914 10:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 965326' 00:11:19.914 killing process with pid 965326 00:11:19.914 10:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 965326 00:11:19.914 [2024-06-10 10:06:41.536839] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:19.914 10:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 965326 00:11:19.914 [2024-06-10 10:06:41.537438] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:19.914 10:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:19.914 00:11:19.914 real 0m8.912s 00:11:19.914 user 0m16.187s 00:11:19.914 sys 0m1.338s 00:11:19.914 10:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:19.914 10:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:19.914 ************************************ 00:11:19.914 END TEST raid_state_function_test_sb 00:11:19.914 ************************************ 00:11:19.914 10:06:41 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:11:19.914 10:06:41 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:11:19.914 10:06:41 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:19.914 10:06:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:19.914 ************************************ 00:11:19.914 START TEST raid_superblock_test 00:11:19.914 ************************************ 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 2 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 
64' 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=967086 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 967086 /var/tmp/spdk-raid.sock 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 967086 ']' 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:19.914 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:19.914 10:06:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:20.174 [2024-06-10 10:06:41.791339] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:11:20.174 [2024-06-10 10:06:41.791384] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid967086 ] 00:11:20.174 [2024-06-10 10:06:41.876399] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:20.174 [2024-06-10 10:06:41.938818] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:20.174 [2024-06-10 10:06:41.978708] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:20.174 [2024-06-10 10:06:41.978731] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:21.117 malloc1 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:21.117 [2024-06-10 10:06:42.964607] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:21.117 [2024-06-10 10:06:42.964640] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:21.117 [2024-06-10 10:06:42.964651] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d41990 00:11:21.117 [2024-06-10 10:06:42.964658] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:21.117 [2024-06-10 10:06:42.965942] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:21.117 [2024-06-10 10:06:42.965961] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:21.117 pt1 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:21.117 10:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:21.377 malloc2 00:11:21.377 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:21.687 [2024-06-10 10:06:43.319404] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:21.687 [2024-06-10 10:06:43.319431] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:21.687 [2024-06-10 10:06:43.319440] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d424e0 00:11:21.687 [2024-06-10 10:06:43.319447] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:21.687 [2024-06-10 10:06:43.320606] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:21.687 [2024-06-10 10:06:43.320624] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:21.687 pt2 00:11:21.687 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:21.687 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:21.687 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:11:21.687 [2024-06-10 10:06:43.495861] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev pt1 is claimed 00:11:21.687 [2024-06-10 10:06:43.496858] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:21.687 [2024-06-10 10:06:43.496965] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1eeabc0 00:11:21.687 [2024-06-10 10:06:43.496973] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:21.687 [2024-06-10 10:06:43.497111] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eefbb0 00:11:21.687 [2024-06-10 10:06:43.497212] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1eeabc0 00:11:21.687 [2024-06-10 10:06:43.497218] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1eeabc0 00:11:21.687 [2024-06-10 10:06:43.497283] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:21.688 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:21.956 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:21.956 "name": "raid_bdev1", 00:11:21.956 "uuid": "afeee89c-2628-41a2-adc7-4245e288b949", 00:11:21.956 "strip_size_kb": 64, 00:11:21.956 "state": "online", 00:11:21.956 "raid_level": "concat", 00:11:21.956 "superblock": true, 00:11:21.956 "num_base_bdevs": 2, 00:11:21.956 "num_base_bdevs_discovered": 2, 00:11:21.956 "num_base_bdevs_operational": 2, 00:11:21.956 "base_bdevs_list": [ 00:11:21.956 { 00:11:21.956 "name": "pt1", 00:11:21.956 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:21.956 "is_configured": true, 00:11:21.956 "data_offset": 2048, 00:11:21.956 "data_size": 63488 00:11:21.956 }, 00:11:21.956 { 00:11:21.956 "name": "pt2", 00:11:21.956 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:21.956 "is_configured": true, 00:11:21.956 "data_offset": 2048, 00:11:21.956 "data_size": 63488 00:11:21.956 } 00:11:21.956 ] 00:11:21.956 }' 00:11:21.956 10:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:21.956 10:06:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:22.526 10:06:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:22.526 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:22.526 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:22.526 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:22.526 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:22.526 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:22.526 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:22.526 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:22.787 [2024-06-10 10:06:44.410324] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:22.787 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:22.787 "name": "raid_bdev1", 00:11:22.787 "aliases": [ 00:11:22.787 "afeee89c-2628-41a2-adc7-4245e288b949" 00:11:22.787 ], 00:11:22.787 "product_name": "Raid Volume", 00:11:22.787 "block_size": 512, 00:11:22.787 "num_blocks": 126976, 00:11:22.787 "uuid": "afeee89c-2628-41a2-adc7-4245e288b949", 00:11:22.787 "assigned_rate_limits": { 00:11:22.787 "rw_ios_per_sec": 0, 00:11:22.787 "rw_mbytes_per_sec": 0, 00:11:22.787 "r_mbytes_per_sec": 0, 00:11:22.787 "w_mbytes_per_sec": 0 00:11:22.787 }, 00:11:22.787 "claimed": false, 00:11:22.787 "zoned": false, 00:11:22.787 "supported_io_types": { 00:11:22.787 "read": true, 00:11:22.787 "write": true, 00:11:22.787 "unmap": true, 00:11:22.787 "write_zeroes": true, 00:11:22.787 "flush": true, 00:11:22.787 "reset": true, 00:11:22.787 "compare": false, 00:11:22.787 "compare_and_write": false, 00:11:22.787 "abort": false, 00:11:22.787 "nvme_admin": false, 00:11:22.787 "nvme_io": false 00:11:22.787 }, 00:11:22.787 "memory_domains": [ 00:11:22.787 { 00:11:22.787 "dma_device_id": "system", 00:11:22.787 "dma_device_type": 1 00:11:22.787 }, 00:11:22.787 { 00:11:22.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.787 "dma_device_type": 2 00:11:22.787 }, 00:11:22.787 { 00:11:22.787 "dma_device_id": "system", 00:11:22.787 "dma_device_type": 1 00:11:22.787 }, 00:11:22.787 { 00:11:22.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.787 "dma_device_type": 2 00:11:22.787 } 00:11:22.787 ], 00:11:22.787 "driver_specific": { 00:11:22.787 "raid": { 00:11:22.787 "uuid": "afeee89c-2628-41a2-adc7-4245e288b949", 00:11:22.787 "strip_size_kb": 64, 00:11:22.787 "state": "online", 00:11:22.787 "raid_level": "concat", 00:11:22.787 "superblock": true, 00:11:22.787 "num_base_bdevs": 2, 00:11:22.787 "num_base_bdevs_discovered": 2, 00:11:22.787 "num_base_bdevs_operational": 2, 00:11:22.787 "base_bdevs_list": [ 00:11:22.787 { 00:11:22.787 "name": "pt1", 00:11:22.787 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:22.787 "is_configured": true, 00:11:22.787 "data_offset": 2048, 00:11:22.787 "data_size": 63488 00:11:22.787 }, 00:11:22.787 { 00:11:22.787 "name": "pt2", 00:11:22.787 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:22.787 "is_configured": true, 00:11:22.787 "data_offset": 2048, 00:11:22.787 "data_size": 63488 00:11:22.787 } 00:11:22.787 ] 00:11:22.787 } 00:11:22.787 } 00:11:22.787 }' 00:11:22.787 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq 
-r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:22.787 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:22.787 pt2' 00:11:22.787 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:22.787 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:22.787 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:23.047 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:23.047 "name": "pt1", 00:11:23.047 "aliases": [ 00:11:23.047 "00000000-0000-0000-0000-000000000001" 00:11:23.047 ], 00:11:23.047 "product_name": "passthru", 00:11:23.047 "block_size": 512, 00:11:23.047 "num_blocks": 65536, 00:11:23.047 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:23.047 "assigned_rate_limits": { 00:11:23.047 "rw_ios_per_sec": 0, 00:11:23.047 "rw_mbytes_per_sec": 0, 00:11:23.047 "r_mbytes_per_sec": 0, 00:11:23.047 "w_mbytes_per_sec": 0 00:11:23.047 }, 00:11:23.047 "claimed": true, 00:11:23.047 "claim_type": "exclusive_write", 00:11:23.047 "zoned": false, 00:11:23.047 "supported_io_types": { 00:11:23.047 "read": true, 00:11:23.047 "write": true, 00:11:23.047 "unmap": true, 00:11:23.048 "write_zeroes": true, 00:11:23.048 "flush": true, 00:11:23.048 "reset": true, 00:11:23.048 "compare": false, 00:11:23.048 "compare_and_write": false, 00:11:23.048 "abort": true, 00:11:23.048 "nvme_admin": false, 00:11:23.048 "nvme_io": false 00:11:23.048 }, 00:11:23.048 "memory_domains": [ 00:11:23.048 { 00:11:23.048 "dma_device_id": "system", 00:11:23.048 "dma_device_type": 1 00:11:23.048 }, 00:11:23.048 { 00:11:23.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:23.048 "dma_device_type": 2 00:11:23.048 } 00:11:23.048 ], 00:11:23.048 "driver_specific": { 00:11:23.048 "passthru": { 00:11:23.048 "name": "pt1", 00:11:23.048 "base_bdev_name": "malloc1" 00:11:23.048 } 00:11:23.048 } 00:11:23.048 }' 00:11:23.048 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:23.048 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:23.048 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:23.048 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:23.048 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:23.048 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:23.048 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:23.048 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:23.308 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:23.308 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:23.308 10:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:23.308 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:23.308 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:23.308 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:23.308 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:23.568 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:23.568 "name": "pt2", 00:11:23.568 "aliases": [ 00:11:23.568 "00000000-0000-0000-0000-000000000002" 00:11:23.568 ], 00:11:23.568 "product_name": "passthru", 00:11:23.568 "block_size": 512, 00:11:23.568 "num_blocks": 65536, 00:11:23.568 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:23.568 "assigned_rate_limits": { 00:11:23.568 "rw_ios_per_sec": 0, 00:11:23.568 "rw_mbytes_per_sec": 0, 00:11:23.568 "r_mbytes_per_sec": 0, 00:11:23.568 "w_mbytes_per_sec": 0 00:11:23.568 }, 00:11:23.568 "claimed": true, 00:11:23.568 "claim_type": "exclusive_write", 00:11:23.568 "zoned": false, 00:11:23.568 "supported_io_types": { 00:11:23.568 "read": true, 00:11:23.568 "write": true, 00:11:23.568 "unmap": true, 00:11:23.568 "write_zeroes": true, 00:11:23.568 "flush": true, 00:11:23.568 "reset": true, 00:11:23.568 "compare": false, 00:11:23.568 "compare_and_write": false, 00:11:23.568 "abort": true, 00:11:23.568 "nvme_admin": false, 00:11:23.568 "nvme_io": false 00:11:23.568 }, 00:11:23.568 "memory_domains": [ 00:11:23.568 { 00:11:23.568 "dma_device_id": "system", 00:11:23.568 "dma_device_type": 1 00:11:23.568 }, 00:11:23.568 { 00:11:23.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:23.568 "dma_device_type": 2 00:11:23.568 } 00:11:23.568 ], 00:11:23.568 "driver_specific": { 00:11:23.568 "passthru": { 00:11:23.568 "name": "pt2", 00:11:23.568 "base_bdev_name": "malloc2" 00:11:23.568 } 00:11:23.568 } 00:11:23.568 }' 00:11:23.568 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:23.568 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:23.568 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:23.568 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:23.568 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:23.568 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:23.568 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:23.568 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:23.828 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:23.828 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:23.828 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:23.828 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:23.828 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:23.828 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:24.088 [2024-06-10 10:06:45.741674] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:24.088 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=afeee89c-2628-41a2-adc7-4245e288b949 00:11:24.088 10:06:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@435 -- # '[' -z afeee89c-2628-41a2-adc7-4245e288b949 ']' 00:11:24.088 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:24.088 [2024-06-10 10:06:45.934000] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:24.088 [2024-06-10 10:06:45.934012] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:24.088 [2024-06-10 10:06:45.934050] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:24.088 [2024-06-10 10:06:45.934082] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:24.088 [2024-06-10 10:06:45.934088] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1eeabc0 name raid_bdev1, state offline 00:11:24.348 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.348 10:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:24.348 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:24.348 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:24.348 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:24.348 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:24.608 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:24.608 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:24.869 
10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:24.869 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:25.129 [2024-06-10 10:06:46.904419] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:25.129 [2024-06-10 10:06:46.905479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:25.129 [2024-06-10 10:06:46.905520] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:25.129 [2024-06-10 10:06:46.905549] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:25.129 [2024-06-10 10:06:46.905559] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:25.129 [2024-06-10 10:06:46.905564] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d403e0 name raid_bdev1, state configuring 00:11:25.129 request: 00:11:25.129 { 00:11:25.129 "name": "raid_bdev1", 00:11:25.129 "raid_level": "concat", 00:11:25.129 "base_bdevs": [ 00:11:25.129 "malloc1", 00:11:25.129 "malloc2" 00:11:25.129 ], 00:11:25.129 "superblock": false, 00:11:25.129 "strip_size_kb": 64, 00:11:25.129 "method": "bdev_raid_create", 00:11:25.129 "req_id": 1 00:11:25.129 } 00:11:25.129 Got JSON-RPC error response 00:11:25.129 response: 00:11:25.129 { 00:11:25.129 "code": -17, 00:11:25.129 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:25.129 } 00:11:25.129 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:11:25.129 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:11:25.129 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:11:25.129 10:06:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:11:25.129 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.129 10:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:25.388 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:25.388 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:25.389 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:25.648 [2024-06-10 10:06:47.285339] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:25.648 [2024-06-10 10:06:47.285360] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:25.648 [2024-06-10 10:06:47.285371] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eea900 00:11:25.648 [2024-06-10 10:06:47.285378] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:25.648 [2024-06-10 10:06:47.286620] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:25.648 [2024-06-10 10:06:47.286639] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:25.648 [2024-06-10 10:06:47.286680] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:25.648 [2024-06-10 10:06:47.286697] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:25.648 pt1 00:11:25.648 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:11:25.648 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:25.648 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:25.648 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:25.648 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:25.648 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:25.648 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:25.648 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:25.648 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:25.648 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:25.649 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.649 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:25.649 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:25.649 "name": "raid_bdev1", 00:11:25.649 "uuid": "afeee89c-2628-41a2-adc7-4245e288b949", 00:11:25.649 "strip_size_kb": 64, 00:11:25.649 "state": "configuring", 00:11:25.649 "raid_level": "concat", 00:11:25.649 "superblock": true, 00:11:25.649 "num_base_bdevs": 2, 00:11:25.649 "num_base_bdevs_discovered": 1, 00:11:25.649 "num_base_bdevs_operational": 2, 00:11:25.649 "base_bdevs_list": [ 00:11:25.649 { 00:11:25.649 "name": "pt1", 00:11:25.649 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:25.649 "is_configured": true, 00:11:25.649 "data_offset": 2048, 00:11:25.649 "data_size": 63488 00:11:25.649 }, 00:11:25.649 { 00:11:25.649 "name": null, 00:11:25.649 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:25.649 "is_configured": false, 00:11:25.649 "data_offset": 2048, 00:11:25.649 "data_size": 63488 00:11:25.649 } 00:11:25.649 ] 00:11:25.649 }' 00:11:25.649 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:25.649 10:06:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 
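For reference, the RPC sequence this part of the trace exercises (bdev_raid.sh@424-430 plus the verify_raid_bdev_state checks) condenses to the sketch below. It assumes an SPDK target such as test/app/bdev_svc/bdev_svc is already listening on /var/tmp/spdk-raid.sock and that SPDK_DIR points at an SPDK checkout; both of those names are assumptions mirroring the paths shown in the trace, not part of the log itself.

  # Hypothetical reproduction of the concat-with-superblock setup tested above.
  rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Two malloc base bdevs with 512-byte blocks (arguments as in bdev_raid.sh@424).
  $rpc bdev_malloc_create 32 512 -b malloc1
  $rpc bdev_malloc_create 32 512 -b malloc2

  # Wrap each malloc bdev in a passthru bdev with a fixed UUID (bdev_raid.sh@425).
  $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

  # Assemble a concat raid bdev with 64 KiB strips and an on-disk superblock (-s).
  $rpc bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s

  # Inspect it; the test asserts state == "online" with two discovered base bdevs,
  # then checks per-base-bdev properties via bdev_get_bdevs -b pt1 / -b pt2.
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

Tear-down in the test uses bdev_raid_delete raid_bdev1 followed by bdev_passthru_delete on each passthru bdev, as the surrounding trace shows.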
00:11:26.219 10:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:26.219 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:26.219 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:26.219 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:26.479 [2024-06-10 10:06:48.175595] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:26.479 [2024-06-10 10:06:48.175622] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:26.479 [2024-06-10 10:06:48.175633] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef0280 00:11:26.479 [2024-06-10 10:06:48.175639] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:26.479 [2024-06-10 10:06:48.175894] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:26.479 [2024-06-10 10:06:48.175905] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:26.479 [2024-06-10 10:06:48.175944] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:26.479 [2024-06-10 10:06:48.175956] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:26.479 [2024-06-10 10:06:48.176025] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ef29d0 00:11:26.479 [2024-06-10 10:06:48.176030] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:26.479 [2024-06-10 10:06:48.176164] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eeab90 00:11:26.479 [2024-06-10 10:06:48.176259] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ef29d0 00:11:26.479 [2024-06-10 10:06:48.176264] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ef29d0 00:11:26.479 [2024-06-10 10:06:48.176340] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:26.479 pt2 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:26.479 10:06:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.479 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:26.739 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:26.739 "name": "raid_bdev1", 00:11:26.739 "uuid": "afeee89c-2628-41a2-adc7-4245e288b949", 00:11:26.739 "strip_size_kb": 64, 00:11:26.739 "state": "online", 00:11:26.739 "raid_level": "concat", 00:11:26.739 "superblock": true, 00:11:26.739 "num_base_bdevs": 2, 00:11:26.739 "num_base_bdevs_discovered": 2, 00:11:26.739 "num_base_bdevs_operational": 2, 00:11:26.739 "base_bdevs_list": [ 00:11:26.739 { 00:11:26.739 "name": "pt1", 00:11:26.739 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:26.739 "is_configured": true, 00:11:26.739 "data_offset": 2048, 00:11:26.739 "data_size": 63488 00:11:26.739 }, 00:11:26.739 { 00:11:26.739 "name": "pt2", 00:11:26.739 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:26.739 "is_configured": true, 00:11:26.739 "data_offset": 2048, 00:11:26.739 "data_size": 63488 00:11:26.739 } 00:11:26.739 ] 00:11:26.739 }' 00:11:26.739 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:26.739 10:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.310 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:27.310 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:27.310 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:27.310 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:27.310 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:27.310 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:27.310 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:27.310 10:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:27.310 [2024-06-10 10:06:49.094093] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:27.310 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:27.310 "name": "raid_bdev1", 00:11:27.310 "aliases": [ 00:11:27.310 "afeee89c-2628-41a2-adc7-4245e288b949" 00:11:27.310 ], 00:11:27.310 "product_name": "Raid Volume", 00:11:27.310 "block_size": 512, 00:11:27.310 "num_blocks": 126976, 00:11:27.310 "uuid": "afeee89c-2628-41a2-adc7-4245e288b949", 00:11:27.310 "assigned_rate_limits": { 00:11:27.310 "rw_ios_per_sec": 0, 00:11:27.310 "rw_mbytes_per_sec": 0, 00:11:27.310 "r_mbytes_per_sec": 0, 00:11:27.310 "w_mbytes_per_sec": 0 00:11:27.310 }, 00:11:27.310 "claimed": false, 00:11:27.310 "zoned": false, 00:11:27.310 "supported_io_types": { 00:11:27.310 "read": true, 00:11:27.310 "write": true, 00:11:27.310 "unmap": true, 00:11:27.310 "write_zeroes": true, 00:11:27.310 "flush": true, 00:11:27.310 "reset": true, 00:11:27.310 "compare": false, 00:11:27.310 "compare_and_write": 
false, 00:11:27.310 "abort": false, 00:11:27.310 "nvme_admin": false, 00:11:27.310 "nvme_io": false 00:11:27.310 }, 00:11:27.310 "memory_domains": [ 00:11:27.310 { 00:11:27.310 "dma_device_id": "system", 00:11:27.310 "dma_device_type": 1 00:11:27.310 }, 00:11:27.310 { 00:11:27.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.310 "dma_device_type": 2 00:11:27.310 }, 00:11:27.310 { 00:11:27.310 "dma_device_id": "system", 00:11:27.310 "dma_device_type": 1 00:11:27.310 }, 00:11:27.310 { 00:11:27.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.310 "dma_device_type": 2 00:11:27.310 } 00:11:27.310 ], 00:11:27.310 "driver_specific": { 00:11:27.310 "raid": { 00:11:27.310 "uuid": "afeee89c-2628-41a2-adc7-4245e288b949", 00:11:27.310 "strip_size_kb": 64, 00:11:27.310 "state": "online", 00:11:27.310 "raid_level": "concat", 00:11:27.310 "superblock": true, 00:11:27.310 "num_base_bdevs": 2, 00:11:27.310 "num_base_bdevs_discovered": 2, 00:11:27.310 "num_base_bdevs_operational": 2, 00:11:27.310 "base_bdevs_list": [ 00:11:27.310 { 00:11:27.310 "name": "pt1", 00:11:27.310 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:27.310 "is_configured": true, 00:11:27.310 "data_offset": 2048, 00:11:27.310 "data_size": 63488 00:11:27.310 }, 00:11:27.310 { 00:11:27.310 "name": "pt2", 00:11:27.310 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:27.310 "is_configured": true, 00:11:27.310 "data_offset": 2048, 00:11:27.310 "data_size": 63488 00:11:27.310 } 00:11:27.310 ] 00:11:27.310 } 00:11:27.310 } 00:11:27.310 }' 00:11:27.310 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:27.310 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:27.310 pt2' 00:11:27.310 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:27.310 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:27.310 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:27.570 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:27.570 "name": "pt1", 00:11:27.570 "aliases": [ 00:11:27.570 "00000000-0000-0000-0000-000000000001" 00:11:27.570 ], 00:11:27.570 "product_name": "passthru", 00:11:27.570 "block_size": 512, 00:11:27.570 "num_blocks": 65536, 00:11:27.570 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:27.570 "assigned_rate_limits": { 00:11:27.570 "rw_ios_per_sec": 0, 00:11:27.570 "rw_mbytes_per_sec": 0, 00:11:27.570 "r_mbytes_per_sec": 0, 00:11:27.570 "w_mbytes_per_sec": 0 00:11:27.570 }, 00:11:27.570 "claimed": true, 00:11:27.570 "claim_type": "exclusive_write", 00:11:27.570 "zoned": false, 00:11:27.570 "supported_io_types": { 00:11:27.570 "read": true, 00:11:27.570 "write": true, 00:11:27.570 "unmap": true, 00:11:27.570 "write_zeroes": true, 00:11:27.570 "flush": true, 00:11:27.570 "reset": true, 00:11:27.570 "compare": false, 00:11:27.570 "compare_and_write": false, 00:11:27.570 "abort": true, 00:11:27.570 "nvme_admin": false, 00:11:27.570 "nvme_io": false 00:11:27.570 }, 00:11:27.570 "memory_domains": [ 00:11:27.570 { 00:11:27.571 "dma_device_id": "system", 00:11:27.571 "dma_device_type": 1 00:11:27.571 }, 00:11:27.571 { 00:11:27.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.571 "dma_device_type": 2 
00:11:27.571 } 00:11:27.571 ], 00:11:27.571 "driver_specific": { 00:11:27.571 "passthru": { 00:11:27.571 "name": "pt1", 00:11:27.571 "base_bdev_name": "malloc1" 00:11:27.571 } 00:11:27.571 } 00:11:27.571 }' 00:11:27.571 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.571 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.571 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:27.571 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.830 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.830 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:27.831 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.831 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.831 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:27.831 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.831 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.831 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:27.831 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:27.831 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:27.831 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:28.090 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:28.090 "name": "pt2", 00:11:28.090 "aliases": [ 00:11:28.090 "00000000-0000-0000-0000-000000000002" 00:11:28.090 ], 00:11:28.090 "product_name": "passthru", 00:11:28.090 "block_size": 512, 00:11:28.090 "num_blocks": 65536, 00:11:28.090 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:28.090 "assigned_rate_limits": { 00:11:28.090 "rw_ios_per_sec": 0, 00:11:28.090 "rw_mbytes_per_sec": 0, 00:11:28.090 "r_mbytes_per_sec": 0, 00:11:28.090 "w_mbytes_per_sec": 0 00:11:28.090 }, 00:11:28.090 "claimed": true, 00:11:28.090 "claim_type": "exclusive_write", 00:11:28.090 "zoned": false, 00:11:28.090 "supported_io_types": { 00:11:28.090 "read": true, 00:11:28.090 "write": true, 00:11:28.090 "unmap": true, 00:11:28.090 "write_zeroes": true, 00:11:28.090 "flush": true, 00:11:28.090 "reset": true, 00:11:28.090 "compare": false, 00:11:28.090 "compare_and_write": false, 00:11:28.090 "abort": true, 00:11:28.090 "nvme_admin": false, 00:11:28.090 "nvme_io": false 00:11:28.090 }, 00:11:28.090 "memory_domains": [ 00:11:28.090 { 00:11:28.090 "dma_device_id": "system", 00:11:28.090 "dma_device_type": 1 00:11:28.090 }, 00:11:28.090 { 00:11:28.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.090 "dma_device_type": 2 00:11:28.090 } 00:11:28.090 ], 00:11:28.090 "driver_specific": { 00:11:28.090 "passthru": { 00:11:28.090 "name": "pt2", 00:11:28.091 "base_bdev_name": "malloc2" 00:11:28.091 } 00:11:28.091 } 00:11:28.091 }' 00:11:28.091 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.091 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.351 10:06:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:28.351 10:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.351 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.351 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:28.351 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.351 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.351 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:28.351 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.351 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:28.612 [2024-06-10 10:06:50.397421] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' afeee89c-2628-41a2-adc7-4245e288b949 '!=' afeee89c-2628-41a2-adc7-4245e288b949 ']' 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 967086 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 967086 ']' 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 967086 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 967086 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 967086' 00:11:28.612 killing process with pid 967086 00:11:28.612 10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 967086 00:11:28.612 [2024-06-10 10:06:50.468749] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:28.612 [2024-06-10 10:06:50.468787] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:28.612 [2024-06-10 10:06:50.468818] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:28.612 [2024-06-10 10:06:50.468829] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ef29d0 name raid_bdev1, state offline 00:11:28.612 
10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 967086 00:11:28.612 [2024-06-10 10:06:50.478031] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:28.874 10:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:28.874 00:11:28.874 real 0m8.860s 00:11:28.874 user 0m16.162s 00:11:28.874 sys 0m1.341s 00:11:28.874 10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:28.874 10:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:28.874 ************************************ 00:11:28.874 END TEST raid_superblock_test 00:11:28.874 ************************************ 00:11:28.874 10:06:50 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:11:28.874 10:06:50 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:28.874 10:06:50 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:28.874 10:06:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:28.874 ************************************ 00:11:28.874 START TEST raid_read_error_test 00:11:28.874 ************************************ 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 2 read 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:28.874 
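The raid_read_error_test being set up here drives I/O through bdevperf and injects failures with SPDK's error bdev. A condensed sketch of that plumbing, using the same bdev names as the trace (SPDK_DIR and an already-running RPC listener on /var/tmp/spdk-raid.sock are assumptions mirroring the paths in the log; this is not the test script itself):

  rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Each raid base is a chain: malloc bdev -> error bdev (exposed as EE_<name>) -> passthru bdev.
  $rpc bdev_malloc_create 32 512 -b BaseBdev1_malloc
  $rpc bdev_error_create BaseBdev1_malloc
  $rpc bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1

  $rpc bdev_malloc_create 32 512 -b BaseBdev2_malloc
  $rpc bdev_error_create BaseBdev2_malloc
  $rpc bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2

  # Concat raid with superblock on top of the two passthru bdevs.
  $rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s

  # With bdevperf I/O running (bdevperf.py ... perform_tests), fail reads on the
  # first base bdev, as bdev_raid.sh@827 does below.
  $rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure

Because concat provides no redundancy, the injected read failure takes raid_bdev1 from online to offline, which the state-change messages at the end of this test confirm.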
10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ydpP1NszIp 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=968836 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 968836 /var/tmp/spdk-raid.sock 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 968836 ']' 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:28.874 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:28.874 10:06:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:28.874 [2024-06-10 10:06:50.734023] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:11:28.874 [2024-06-10 10:06:50.734076] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid968836 ] 00:11:29.135 [2024-06-10 10:06:50.824896] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:29.135 [2024-06-10 10:06:50.890452] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:29.135 [2024-06-10 10:06:50.929754] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:29.135 [2024-06-10 10:06:50.929776] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:29.707 10:06:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:29.707 10:06:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:11:29.707 10:06:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:29.707 10:06:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:29.968 BaseBdev1_malloc 00:11:29.968 10:06:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:30.228 true 00:11:30.228 10:06:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:30.488 [2024-06-10 10:06:52.116348] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:30.488 [2024-06-10 
10:06:52.116382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:30.488 [2024-06-10 10:06:52.116393] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd08d10 00:11:30.488 [2024-06-10 10:06:52.116399] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:30.488 [2024-06-10 10:06:52.117737] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:30.488 [2024-06-10 10:06:52.117757] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:30.488 BaseBdev1 00:11:30.488 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:30.488 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:30.488 BaseBdev2_malloc 00:11:30.489 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:30.749 true 00:11:30.749 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:31.009 [2024-06-10 10:06:52.659370] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:31.009 [2024-06-10 10:06:52.659407] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:31.009 [2024-06-10 10:06:52.659418] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd0d710 00:11:31.009 [2024-06-10 10:06:52.659424] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:31.009 [2024-06-10 10:06:52.660589] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:31.009 [2024-06-10 10:06:52.660607] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:31.009 BaseBdev2 00:11:31.009 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:31.009 [2024-06-10 10:06:52.839852] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:31.009 [2024-06-10 10:06:52.840838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:31.009 [2024-06-10 10:06:52.840975] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd0e9f0 00:11:31.009 [2024-06-10 10:06:52.840983] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:31.009 [2024-06-10 10:06:52.841124] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd0ecd0 00:11:31.009 [2024-06-10 10:06:52.841234] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd0e9f0 00:11:31.009 [2024-06-10 10:06:52.841239] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd0e9f0 00:11:31.009 [2024-06-10 10:06:52.841311] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:31.009 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 
online concat 64 2 00:11:31.009 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:31.009 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:31.009 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:31.009 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:31.009 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:31.009 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:31.009 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:31.009 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:31.009 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:31.010 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.010 10:06:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:31.270 10:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:31.270 "name": "raid_bdev1", 00:11:31.270 "uuid": "c0605988-8362-4ab1-95a1-8d6cb7e8faea", 00:11:31.270 "strip_size_kb": 64, 00:11:31.270 "state": "online", 00:11:31.270 "raid_level": "concat", 00:11:31.270 "superblock": true, 00:11:31.270 "num_base_bdevs": 2, 00:11:31.270 "num_base_bdevs_discovered": 2, 00:11:31.270 "num_base_bdevs_operational": 2, 00:11:31.270 "base_bdevs_list": [ 00:11:31.270 { 00:11:31.270 "name": "BaseBdev1", 00:11:31.270 "uuid": "8f9a5ff3-e437-5447-bd64-335aeeed45b6", 00:11:31.270 "is_configured": true, 00:11:31.270 "data_offset": 2048, 00:11:31.270 "data_size": 63488 00:11:31.270 }, 00:11:31.270 { 00:11:31.270 "name": "BaseBdev2", 00:11:31.270 "uuid": "a8a0eb31-cbf9-59a3-9fa4-6c08e03b2790", 00:11:31.270 "is_configured": true, 00:11:31.270 "data_offset": 2048, 00:11:31.270 "data_size": 63488 00:11:31.270 } 00:11:31.270 ] 00:11:31.270 }' 00:11:31.270 10:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:31.270 10:06:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:31.842 10:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:31.842 10:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:31.842 [2024-06-10 10:06:53.690177] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb5e600 00:11:32.783 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # 
verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.044 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:33.305 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:33.305 "name": "raid_bdev1", 00:11:33.305 "uuid": "c0605988-8362-4ab1-95a1-8d6cb7e8faea", 00:11:33.305 "strip_size_kb": 64, 00:11:33.305 "state": "online", 00:11:33.305 "raid_level": "concat", 00:11:33.305 "superblock": true, 00:11:33.305 "num_base_bdevs": 2, 00:11:33.305 "num_base_bdevs_discovered": 2, 00:11:33.305 "num_base_bdevs_operational": 2, 00:11:33.305 "base_bdevs_list": [ 00:11:33.305 { 00:11:33.305 "name": "BaseBdev1", 00:11:33.305 "uuid": "8f9a5ff3-e437-5447-bd64-335aeeed45b6", 00:11:33.305 "is_configured": true, 00:11:33.305 "data_offset": 2048, 00:11:33.305 "data_size": 63488 00:11:33.305 }, 00:11:33.305 { 00:11:33.305 "name": "BaseBdev2", 00:11:33.305 "uuid": "a8a0eb31-cbf9-59a3-9fa4-6c08e03b2790", 00:11:33.305 "is_configured": true, 00:11:33.305 "data_offset": 2048, 00:11:33.305 "data_size": 63488 00:11:33.305 } 00:11:33.305 ] 00:11:33.305 }' 00:11:33.305 10:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:33.305 10:06:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.876 10:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:33.876 [2024-06-10 10:06:55.677451] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:33.876 [2024-06-10 10:06:55.677482] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:33.876 [2024-06-10 10:06:55.680095] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:33.876 [2024-06-10 10:06:55.680119] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:33.876 [2024-06-10 10:06:55.680138] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:33.876 [2024-06-10 10:06:55.680145] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd0e9f0 name raid_bdev1, state offline 00:11:33.876 0 00:11:33.876 10:06:55 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 968836 00:11:33.876 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 968836 ']' 00:11:33.876 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 968836 00:11:33.876 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:11:33.876 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:33.876 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 968836 00:11:33.876 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:33.876 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:33.876 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 968836' 00:11:33.876 killing process with pid 968836 00:11:33.876 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 968836 00:11:33.876 [2024-06-10 10:06:55.737559] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:33.876 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 968836 00:11:34.138 [2024-06-10 10:06:55.743369] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:34.138 10:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ydpP1NszIp 00:11:34.138 10:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:34.138 10:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:34.138 10:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:11:34.138 10:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:11:34.138 10:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:34.138 10:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:34.138 10:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:11:34.138 00:11:34.138 real 0m5.209s 00:11:34.138 user 0m8.180s 00:11:34.138 sys 0m0.724s 00:11:34.138 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:34.138 10:06:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.138 ************************************ 00:11:34.138 END TEST raid_read_error_test 00:11:34.138 ************************************ 00:11:34.138 10:06:55 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:11:34.138 10:06:55 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:34.138 10:06:55 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:34.138 10:06:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:34.138 ************************************ 00:11:34.138 START TEST raid_write_error_test 00:11:34.138 ************************************ 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 2 write 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local 
num_base_bdevs=2 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:34.138 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.7ALf0Tspyx 00:11:34.139 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=969850 00:11:34.139 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 969850 /var/tmp/spdk-raid.sock 00:11:34.139 10:06:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:34.139 10:06:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 969850 ']' 00:11:34.139 10:06:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:34.139 10:06:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:34.139 10:06:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:34.139 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
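The raid_write_error_test run that starts here repeats the same setup the read-error test above just tore down: bdevperf is launched against a private RPC socket, each base bdev is a malloc disk wrapped first by an error bdev and then by a passthru bdev, and the two passthrus are assembled into a 64k-strip concat raid with a superblock. A condensed sketch of that sequence, using only the commands and names visible in this log (paths are shortened to the SPDK checkout root, and the bdevperf log file is really a mktemp name under /raidtest), is:

    # condensed form of the setup traced around this point in the log
    bdevperf_log=$(mktemp -p /raidtest)
    ./build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" 2>&1 &
    # (the script waits for the socket with waitforlisten before issuing RPCs)
    RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for i in 1 2; do
        $RPC bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"      # 32 MiB backing store, 512 B blocks
        $RPC bdev_error_create "BaseBdev${i}_malloc"                 # exposes EE_BaseBdev${i}_malloc
        $RPC bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"
    done
    $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s   # 64k strip, with superblock
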
00:11:34.139 10:06:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:34.139 10:06:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.400 [2024-06-10 10:06:56.018589] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:11:34.400 [2024-06-10 10:06:56.018636] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid969850 ] 00:11:34.400 [2024-06-10 10:06:56.105585] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:34.400 [2024-06-10 10:06:56.167779] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:34.400 [2024-06-10 10:06:56.206255] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:34.400 [2024-06-10 10:06:56.206279] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:35.343 10:06:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:35.343 10:06:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:11:35.343 10:06:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:35.343 10:06:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:35.343 BaseBdev1_malloc 00:11:35.343 10:06:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:35.343 true 00:11:35.343 10:06:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:35.603 [2024-06-10 10:06:57.336412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:35.603 [2024-06-10 10:06:57.336448] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:35.603 [2024-06-10 10:06:57.336459] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b13d10 00:11:35.603 [2024-06-10 10:06:57.336466] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:35.603 [2024-06-10 10:06:57.337811] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:35.603 [2024-06-10 10:06:57.337836] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:35.603 BaseBdev1 00:11:35.603 10:06:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:35.603 10:06:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:35.864 BaseBdev2_malloc 00:11:35.864 10:06:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:35.864 true 00:11:35.864 10:06:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:36.125 [2024-06-10 10:06:57.859347] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:36.125 [2024-06-10 10:06:57.859375] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:36.125 [2024-06-10 10:06:57.859385] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b18710 00:11:36.125 [2024-06-10 10:06:57.859390] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:36.125 [2024-06-10 10:06:57.860535] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:36.125 [2024-06-10 10:06:57.860552] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:36.125 BaseBdev2 00:11:36.125 10:06:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:36.385 [2024-06-10 10:06:58.039831] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:36.385 [2024-06-10 10:06:58.040800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:36.385 [2024-06-10 10:06:58.040942] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b199f0 00:11:36.386 [2024-06-10 10:06:58.040951] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:36.386 [2024-06-10 10:06:58.041089] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b19cd0 00:11:36.386 [2024-06-10 10:06:58.041198] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b199f0 00:11:36.386 [2024-06-10 10:06:58.041203] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b199f0 00:11:36.386 [2024-06-10 10:06:58.041276] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:36.386 "name": "raid_bdev1", 00:11:36.386 "uuid": "b796d2f9-e643-4e06-8be9-471afd004027", 00:11:36.386 "strip_size_kb": 64, 00:11:36.386 "state": "online", 00:11:36.386 "raid_level": "concat", 00:11:36.386 "superblock": true, 00:11:36.386 "num_base_bdevs": 2, 00:11:36.386 "num_base_bdevs_discovered": 2, 00:11:36.386 "num_base_bdevs_operational": 2, 00:11:36.386 "base_bdevs_list": [ 00:11:36.386 { 00:11:36.386 "name": "BaseBdev1", 00:11:36.386 "uuid": "c28a9d59-3ed2-5ef1-941d-d6d7d600f668", 00:11:36.386 "is_configured": true, 00:11:36.386 "data_offset": 2048, 00:11:36.386 "data_size": 63488 00:11:36.386 }, 00:11:36.386 { 00:11:36.386 "name": "BaseBdev2", 00:11:36.386 "uuid": "421e3921-cdd1-5cd1-b06a-27c71333f417", 00:11:36.386 "is_configured": true, 00:11:36.386 "data_offset": 2048, 00:11:36.386 "data_size": 63488 00:11:36.386 } 00:11:36.386 ] 00:11:36.386 }' 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:36.386 10:06:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:36.956 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:36.956 10:06:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:37.216 [2024-06-10 10:06:58.882141] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1969600 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
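Around this point the test performs its actual error check: perform_tests is kicked off in the background, write failures are injected into the error bdev under BaseBdev1, the raid is re-verified as still online (concat has no redundancy, so both base bdevs must remain operational), and the failure rate is later pulled out of the bdevperf log. The assertions inside verify_raid_bdev_state are not traced verbatim here (xtrace is disabled inside it), so the following is only an approximation assembled from the commands and values shown in this log:

    RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    ./examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests &
    sleep 1
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure      # start failing writes on one leg
    state=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state')
    [ "$state" = online ]                                               # raid must still be online
    # the script then kills bdevperf (killprocess) and waits before reading its log:
    bdevperf_log=/raidtest/tmp.7ALf0Tspyx                               # mktemp name from this particular run
    fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
    [ "$fail_per_s" != 0.00 ]                                           # some I/O must actually have failed
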
00:11:38.157 10:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:38.417 10:07:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:38.417 "name": "raid_bdev1", 00:11:38.417 "uuid": "b796d2f9-e643-4e06-8be9-471afd004027", 00:11:38.417 "strip_size_kb": 64, 00:11:38.417 "state": "online", 00:11:38.417 "raid_level": "concat", 00:11:38.417 "superblock": true, 00:11:38.417 "num_base_bdevs": 2, 00:11:38.417 "num_base_bdevs_discovered": 2, 00:11:38.417 "num_base_bdevs_operational": 2, 00:11:38.417 "base_bdevs_list": [ 00:11:38.417 { 00:11:38.417 "name": "BaseBdev1", 00:11:38.417 "uuid": "c28a9d59-3ed2-5ef1-941d-d6d7d600f668", 00:11:38.417 "is_configured": true, 00:11:38.417 "data_offset": 2048, 00:11:38.417 "data_size": 63488 00:11:38.417 }, 00:11:38.417 { 00:11:38.417 "name": "BaseBdev2", 00:11:38.417 "uuid": "421e3921-cdd1-5cd1-b06a-27c71333f417", 00:11:38.417 "is_configured": true, 00:11:38.417 "data_offset": 2048, 00:11:38.417 "data_size": 63488 00:11:38.417 } 00:11:38.417 ] 00:11:38.417 }' 00:11:38.417 10:07:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:38.417 10:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.035 10:07:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:39.035 [2024-06-10 10:07:00.900339] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:39.035 [2024-06-10 10:07:00.900377] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:39.295 [2024-06-10 10:07:00.903055] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:39.295 [2024-06-10 10:07:00.903078] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:39.295 [2024-06-10 10:07:00.903097] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:39.295 [2024-06-10 10:07:00.903102] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b199f0 name raid_bdev1, state offline 00:11:39.295 0 00:11:39.295 10:07:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 969850 00:11:39.295 10:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 969850 ']' 00:11:39.295 10:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 969850 00:11:39.295 10:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:11:39.295 10:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:39.296 10:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 969850 00:11:39.296 10:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:39.296 10:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:39.296 10:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 969850' 00:11:39.296 killing process with pid 969850 00:11:39.296 10:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 969850 00:11:39.296 [2024-06-10 10:07:00.968390] bdev_raid.c:1358:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:11:39.296 10:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 969850 00:11:39.296 [2024-06-10 10:07:00.974240] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:39.296 10:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.7ALf0Tspyx 00:11:39.296 10:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:39.296 10:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:39.296 10:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:11:39.296 10:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:11:39.296 10:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:39.296 10:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:39.296 10:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:11:39.296 00:11:39.296 real 0m5.157s 00:11:39.296 user 0m8.072s 00:11:39.296 sys 0m0.727s 00:11:39.296 10:07:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:39.296 10:07:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.296 ************************************ 00:11:39.296 END TEST raid_write_error_test 00:11:39.296 ************************************ 00:11:39.296 10:07:01 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:39.296 10:07:01 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:11:39.296 10:07:01 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:39.296 10:07:01 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:39.296 10:07:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:39.556 ************************************ 00:11:39.556 START TEST raid_state_function_test 00:11:39.556 ************************************ 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 false 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:39.556 10:07:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=970865 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 970865' 00:11:39.556 Process raid pid: 970865 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 970865 /var/tmp/spdk-raid.sock 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 970865 ']' 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:39.556 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:39.556 10:07:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.556 [2024-06-10 10:07:01.255877] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
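raid_state_function_test, which begins here, uses bdev_svc as the RPC target instead of bdevperf and walks Existed_Raid through its states: creating the raid1 volume before its base bdevs exist leaves it "configuring", and registering the malloc base bdevs brings it "online". A condensed sketch of the sequence that follows (names and sizes taken from this log; the test also deletes and re-creates Existed_Raid between steps, which is omitted here, and paths are shortened to the SPDK checkout root):

    ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid   # base bdevs absent -> "configuring"
    $RPC bdev_malloc_create 32 512 -b BaseBdev1                               # 1 of 2 discovered, still "configuring"
    $RPC bdev_malloc_create 32 512 -b BaseBdev2                               # 2 of 2 -> state becomes "online"
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'
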
00:11:39.556 [2024-06-10 10:07:01.255923] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:39.556 [2024-06-10 10:07:01.342789] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:39.556 [2024-06-10 10:07:01.404899] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:39.815 [2024-06-10 10:07:01.442981] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:39.815 [2024-06-10 10:07:01.443002] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:40.385 10:07:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:40.385 10:07:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:11:40.385 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:40.385 [2024-06-10 10:07:02.249876] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:40.385 [2024-06-10 10:07:02.249907] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:40.386 [2024-06-10 10:07:02.249912] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:40.386 [2024-06-10 10:07:02.249918] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:40.646 "name": "Existed_Raid", 00:11:40.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:40.646 "strip_size_kb": 0, 00:11:40.646 "state": "configuring", 00:11:40.646 "raid_level": "raid1", 00:11:40.646 "superblock": false, 00:11:40.646 "num_base_bdevs": 2, 
00:11:40.646 "num_base_bdevs_discovered": 0, 00:11:40.646 "num_base_bdevs_operational": 2, 00:11:40.646 "base_bdevs_list": [ 00:11:40.646 { 00:11:40.646 "name": "BaseBdev1", 00:11:40.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:40.646 "is_configured": false, 00:11:40.646 "data_offset": 0, 00:11:40.646 "data_size": 0 00:11:40.646 }, 00:11:40.646 { 00:11:40.646 "name": "BaseBdev2", 00:11:40.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:40.646 "is_configured": false, 00:11:40.646 "data_offset": 0, 00:11:40.646 "data_size": 0 00:11:40.646 } 00:11:40.646 ] 00:11:40.646 }' 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:40.646 10:07:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:41.224 10:07:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:41.483 [2024-06-10 10:07:03.148055] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:41.483 [2024-06-10 10:07:03.148069] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xce6b00 name Existed_Raid, state configuring 00:11:41.483 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:41.483 [2024-06-10 10:07:03.340546] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:41.483 [2024-06-10 10:07:03.340564] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:41.483 [2024-06-10 10:07:03.340569] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:41.483 [2024-06-10 10:07:03.340575] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:41.743 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:41.743 [2024-06-10 10:07:03.527494] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:41.743 BaseBdev1 00:11:41.743 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:41.743 10:07:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:11:41.743 10:07:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:41.743 10:07:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:41.743 10:07:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:41.743 10:07:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:41.743 10:07:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:42.002 10:07:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:42.262 [ 00:11:42.262 { 00:11:42.262 
"name": "BaseBdev1", 00:11:42.262 "aliases": [ 00:11:42.262 "9e8ee540-0a32-4690-8910-b2202b89737d" 00:11:42.262 ], 00:11:42.262 "product_name": "Malloc disk", 00:11:42.262 "block_size": 512, 00:11:42.262 "num_blocks": 65536, 00:11:42.262 "uuid": "9e8ee540-0a32-4690-8910-b2202b89737d", 00:11:42.262 "assigned_rate_limits": { 00:11:42.262 "rw_ios_per_sec": 0, 00:11:42.262 "rw_mbytes_per_sec": 0, 00:11:42.262 "r_mbytes_per_sec": 0, 00:11:42.262 "w_mbytes_per_sec": 0 00:11:42.262 }, 00:11:42.262 "claimed": true, 00:11:42.262 "claim_type": "exclusive_write", 00:11:42.262 "zoned": false, 00:11:42.262 "supported_io_types": { 00:11:42.262 "read": true, 00:11:42.262 "write": true, 00:11:42.262 "unmap": true, 00:11:42.262 "write_zeroes": true, 00:11:42.262 "flush": true, 00:11:42.262 "reset": true, 00:11:42.262 "compare": false, 00:11:42.262 "compare_and_write": false, 00:11:42.262 "abort": true, 00:11:42.262 "nvme_admin": false, 00:11:42.262 "nvme_io": false 00:11:42.262 }, 00:11:42.262 "memory_domains": [ 00:11:42.262 { 00:11:42.262 "dma_device_id": "system", 00:11:42.262 "dma_device_type": 1 00:11:42.262 }, 00:11:42.262 { 00:11:42.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:42.262 "dma_device_type": 2 00:11:42.262 } 00:11:42.262 ], 00:11:42.262 "driver_specific": {} 00:11:42.262 } 00:11:42.262 ] 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.262 10:07:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:42.262 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:42.262 "name": "Existed_Raid", 00:11:42.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:42.262 "strip_size_kb": 0, 00:11:42.262 "state": "configuring", 00:11:42.262 "raid_level": "raid1", 00:11:42.262 "superblock": false, 00:11:42.262 "num_base_bdevs": 2, 00:11:42.262 "num_base_bdevs_discovered": 1, 00:11:42.262 "num_base_bdevs_operational": 2, 00:11:42.262 "base_bdevs_list": [ 00:11:42.262 { 00:11:42.262 "name": "BaseBdev1", 00:11:42.262 "uuid": "9e8ee540-0a32-4690-8910-b2202b89737d", 00:11:42.262 "is_configured": 
true, 00:11:42.262 "data_offset": 0, 00:11:42.262 "data_size": 65536 00:11:42.262 }, 00:11:42.262 { 00:11:42.262 "name": "BaseBdev2", 00:11:42.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:42.262 "is_configured": false, 00:11:42.262 "data_offset": 0, 00:11:42.262 "data_size": 0 00:11:42.262 } 00:11:42.262 ] 00:11:42.262 }' 00:11:42.262 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:42.262 10:07:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:42.832 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:43.095 [2024-06-10 10:07:04.778656] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:43.095 [2024-06-10 10:07:04.778681] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xce63f0 name Existed_Raid, state configuring 00:11:43.095 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:43.095 [2024-06-10 10:07:04.959132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:43.095 [2024-06-10 10:07:04.960298] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:43.095 [2024-06-10 10:07:04.960322] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.388 10:07:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:43.388 10:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:43.388 "name": "Existed_Raid", 00:11:43.388 "uuid": "00000000-0000-0000-0000-000000000000", 
00:11:43.388 "strip_size_kb": 0, 00:11:43.388 "state": "configuring", 00:11:43.388 "raid_level": "raid1", 00:11:43.388 "superblock": false, 00:11:43.388 "num_base_bdevs": 2, 00:11:43.388 "num_base_bdevs_discovered": 1, 00:11:43.388 "num_base_bdevs_operational": 2, 00:11:43.388 "base_bdevs_list": [ 00:11:43.388 { 00:11:43.388 "name": "BaseBdev1", 00:11:43.388 "uuid": "9e8ee540-0a32-4690-8910-b2202b89737d", 00:11:43.388 "is_configured": true, 00:11:43.388 "data_offset": 0, 00:11:43.388 "data_size": 65536 00:11:43.388 }, 00:11:43.388 { 00:11:43.388 "name": "BaseBdev2", 00:11:43.388 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:43.388 "is_configured": false, 00:11:43.388 "data_offset": 0, 00:11:43.388 "data_size": 0 00:11:43.388 } 00:11:43.388 ] 00:11:43.388 }' 00:11:43.388 10:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:43.388 10:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.962 10:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:44.223 [2024-06-10 10:07:05.898448] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:44.223 [2024-06-10 10:07:05.898472] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xce71c0 00:11:44.223 [2024-06-10 10:07:05.898476] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:11:44.223 [2024-06-10 10:07:05.898621] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe9a220 00:11:44.223 [2024-06-10 10:07:05.898713] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xce71c0 00:11:44.223 [2024-06-10 10:07:05.898718] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xce71c0 00:11:44.223 [2024-06-10 10:07:05.898848] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:44.223 BaseBdev2 00:11:44.223 10:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:44.223 10:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:11:44.223 10:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:44.223 10:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:44.223 10:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:44.223 10:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:44.223 10:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:44.568 10:07:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:44.568 [ 00:11:44.568 { 00:11:44.568 "name": "BaseBdev2", 00:11:44.568 "aliases": [ 00:11:44.568 "4e21c5e7-b373-41b0-af6c-470926116a67" 00:11:44.568 ], 00:11:44.568 "product_name": "Malloc disk", 00:11:44.568 "block_size": 512, 00:11:44.568 "num_blocks": 65536, 00:11:44.568 "uuid": "4e21c5e7-b373-41b0-af6c-470926116a67", 00:11:44.568 
"assigned_rate_limits": { 00:11:44.568 "rw_ios_per_sec": 0, 00:11:44.568 "rw_mbytes_per_sec": 0, 00:11:44.568 "r_mbytes_per_sec": 0, 00:11:44.568 "w_mbytes_per_sec": 0 00:11:44.568 }, 00:11:44.568 "claimed": true, 00:11:44.568 "claim_type": "exclusive_write", 00:11:44.568 "zoned": false, 00:11:44.568 "supported_io_types": { 00:11:44.568 "read": true, 00:11:44.568 "write": true, 00:11:44.568 "unmap": true, 00:11:44.568 "write_zeroes": true, 00:11:44.568 "flush": true, 00:11:44.568 "reset": true, 00:11:44.568 "compare": false, 00:11:44.568 "compare_and_write": false, 00:11:44.568 "abort": true, 00:11:44.568 "nvme_admin": false, 00:11:44.568 "nvme_io": false 00:11:44.568 }, 00:11:44.568 "memory_domains": [ 00:11:44.568 { 00:11:44.568 "dma_device_id": "system", 00:11:44.568 "dma_device_type": 1 00:11:44.568 }, 00:11:44.568 { 00:11:44.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:44.569 "dma_device_type": 2 00:11:44.569 } 00:11:44.569 ], 00:11:44.569 "driver_specific": {} 00:11:44.569 } 00:11:44.569 ] 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.569 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:44.829 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.829 "name": "Existed_Raid", 00:11:44.829 "uuid": "188ca0c7-2e1c-4606-89e8-1c33925fda2f", 00:11:44.829 "strip_size_kb": 0, 00:11:44.829 "state": "online", 00:11:44.829 "raid_level": "raid1", 00:11:44.829 "superblock": false, 00:11:44.829 "num_base_bdevs": 2, 00:11:44.829 "num_base_bdevs_discovered": 2, 00:11:44.829 "num_base_bdevs_operational": 2, 00:11:44.829 "base_bdevs_list": [ 00:11:44.829 { 00:11:44.829 "name": "BaseBdev1", 00:11:44.829 "uuid": "9e8ee540-0a32-4690-8910-b2202b89737d", 00:11:44.829 "is_configured": true, 00:11:44.829 "data_offset": 0, 00:11:44.829 "data_size": 65536 00:11:44.829 }, 00:11:44.829 { 
00:11:44.829 "name": "BaseBdev2", 00:11:44.829 "uuid": "4e21c5e7-b373-41b0-af6c-470926116a67", 00:11:44.829 "is_configured": true, 00:11:44.829 "data_offset": 0, 00:11:44.829 "data_size": 65536 00:11:44.829 } 00:11:44.829 ] 00:11:44.829 }' 00:11:44.829 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.829 10:07:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.401 10:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:45.401 [2024-06-10 10:07:07.165850] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:45.401 "name": "Existed_Raid", 00:11:45.401 "aliases": [ 00:11:45.401 "188ca0c7-2e1c-4606-89e8-1c33925fda2f" 00:11:45.401 ], 00:11:45.401 "product_name": "Raid Volume", 00:11:45.401 "block_size": 512, 00:11:45.401 "num_blocks": 65536, 00:11:45.401 "uuid": "188ca0c7-2e1c-4606-89e8-1c33925fda2f", 00:11:45.401 "assigned_rate_limits": { 00:11:45.401 "rw_ios_per_sec": 0, 00:11:45.401 "rw_mbytes_per_sec": 0, 00:11:45.401 "r_mbytes_per_sec": 0, 00:11:45.401 "w_mbytes_per_sec": 0 00:11:45.401 }, 00:11:45.401 "claimed": false, 00:11:45.401 "zoned": false, 00:11:45.401 "supported_io_types": { 00:11:45.401 "read": true, 00:11:45.401 "write": true, 00:11:45.401 "unmap": false, 00:11:45.401 "write_zeroes": true, 00:11:45.401 "flush": false, 00:11:45.401 "reset": true, 00:11:45.401 "compare": false, 00:11:45.401 "compare_and_write": false, 00:11:45.401 "abort": false, 00:11:45.401 "nvme_admin": false, 00:11:45.401 "nvme_io": false 00:11:45.401 }, 00:11:45.401 "memory_domains": [ 00:11:45.401 { 00:11:45.401 "dma_device_id": "system", 00:11:45.401 "dma_device_type": 1 00:11:45.401 }, 00:11:45.401 { 00:11:45.401 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:45.401 "dma_device_type": 2 00:11:45.401 }, 00:11:45.401 { 00:11:45.401 "dma_device_id": "system", 00:11:45.401 "dma_device_type": 1 00:11:45.401 }, 00:11:45.401 { 00:11:45.401 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:45.401 "dma_device_type": 2 00:11:45.401 } 00:11:45.401 ], 00:11:45.401 "driver_specific": { 00:11:45.401 "raid": { 00:11:45.401 "uuid": "188ca0c7-2e1c-4606-89e8-1c33925fda2f", 00:11:45.401 "strip_size_kb": 0, 00:11:45.401 "state": "online", 00:11:45.401 "raid_level": "raid1", 00:11:45.401 "superblock": false, 00:11:45.401 "num_base_bdevs": 2, 00:11:45.401 "num_base_bdevs_discovered": 2, 00:11:45.401 "num_base_bdevs_operational": 2, 00:11:45.401 "base_bdevs_list": [ 00:11:45.401 { 00:11:45.401 "name": 
"BaseBdev1", 00:11:45.401 "uuid": "9e8ee540-0a32-4690-8910-b2202b89737d", 00:11:45.401 "is_configured": true, 00:11:45.401 "data_offset": 0, 00:11:45.401 "data_size": 65536 00:11:45.401 }, 00:11:45.401 { 00:11:45.401 "name": "BaseBdev2", 00:11:45.401 "uuid": "4e21c5e7-b373-41b0-af6c-470926116a67", 00:11:45.401 "is_configured": true, 00:11:45.401 "data_offset": 0, 00:11:45.401 "data_size": 65536 00:11:45.401 } 00:11:45.401 ] 00:11:45.401 } 00:11:45.401 } 00:11:45.401 }' 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:45.401 BaseBdev2' 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:45.401 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:45.660 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:45.660 "name": "BaseBdev1", 00:11:45.660 "aliases": [ 00:11:45.660 "9e8ee540-0a32-4690-8910-b2202b89737d" 00:11:45.660 ], 00:11:45.660 "product_name": "Malloc disk", 00:11:45.660 "block_size": 512, 00:11:45.660 "num_blocks": 65536, 00:11:45.660 "uuid": "9e8ee540-0a32-4690-8910-b2202b89737d", 00:11:45.660 "assigned_rate_limits": { 00:11:45.660 "rw_ios_per_sec": 0, 00:11:45.660 "rw_mbytes_per_sec": 0, 00:11:45.660 "r_mbytes_per_sec": 0, 00:11:45.660 "w_mbytes_per_sec": 0 00:11:45.660 }, 00:11:45.660 "claimed": true, 00:11:45.660 "claim_type": "exclusive_write", 00:11:45.660 "zoned": false, 00:11:45.660 "supported_io_types": { 00:11:45.660 "read": true, 00:11:45.660 "write": true, 00:11:45.660 "unmap": true, 00:11:45.660 "write_zeroes": true, 00:11:45.661 "flush": true, 00:11:45.661 "reset": true, 00:11:45.661 "compare": false, 00:11:45.661 "compare_and_write": false, 00:11:45.661 "abort": true, 00:11:45.661 "nvme_admin": false, 00:11:45.661 "nvme_io": false 00:11:45.661 }, 00:11:45.661 "memory_domains": [ 00:11:45.661 { 00:11:45.661 "dma_device_id": "system", 00:11:45.661 "dma_device_type": 1 00:11:45.661 }, 00:11:45.661 { 00:11:45.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:45.661 "dma_device_type": 2 00:11:45.661 } 00:11:45.661 ], 00:11:45.661 "driver_specific": {} 00:11:45.661 }' 00:11:45.661 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:45.661 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:45.661 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:45.661 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:45.920 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:45.920 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:45.920 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:45.920 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:45.920 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:11:45.920 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:45.920 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:45.920 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:45.920 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:45.920 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:45.920 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:46.180 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:46.180 "name": "BaseBdev2", 00:11:46.180 "aliases": [ 00:11:46.180 "4e21c5e7-b373-41b0-af6c-470926116a67" 00:11:46.180 ], 00:11:46.180 "product_name": "Malloc disk", 00:11:46.180 "block_size": 512, 00:11:46.180 "num_blocks": 65536, 00:11:46.180 "uuid": "4e21c5e7-b373-41b0-af6c-470926116a67", 00:11:46.180 "assigned_rate_limits": { 00:11:46.180 "rw_ios_per_sec": 0, 00:11:46.180 "rw_mbytes_per_sec": 0, 00:11:46.180 "r_mbytes_per_sec": 0, 00:11:46.180 "w_mbytes_per_sec": 0 00:11:46.180 }, 00:11:46.180 "claimed": true, 00:11:46.180 "claim_type": "exclusive_write", 00:11:46.180 "zoned": false, 00:11:46.180 "supported_io_types": { 00:11:46.180 "read": true, 00:11:46.180 "write": true, 00:11:46.180 "unmap": true, 00:11:46.180 "write_zeroes": true, 00:11:46.180 "flush": true, 00:11:46.180 "reset": true, 00:11:46.180 "compare": false, 00:11:46.180 "compare_and_write": false, 00:11:46.180 "abort": true, 00:11:46.180 "nvme_admin": false, 00:11:46.180 "nvme_io": false 00:11:46.180 }, 00:11:46.180 "memory_domains": [ 00:11:46.180 { 00:11:46.180 "dma_device_id": "system", 00:11:46.180 "dma_device_type": 1 00:11:46.180 }, 00:11:46.180 { 00:11:46.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.180 "dma_device_type": 2 00:11:46.180 } 00:11:46.180 ], 00:11:46.180 "driver_specific": {} 00:11:46.180 }' 00:11:46.180 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:46.180 10:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:46.180 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:46.180 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:46.440 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:46.440 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:46.440 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:46.440 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:46.440 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:46.440 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:46.440 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:46.440 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:46.440 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:11:46.700 [2024-06-10 10:07:08.468988] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.701 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:46.961 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:46.961 "name": "Existed_Raid", 00:11:46.961 "uuid": "188ca0c7-2e1c-4606-89e8-1c33925fda2f", 00:11:46.961 "strip_size_kb": 0, 00:11:46.961 "state": "online", 00:11:46.961 "raid_level": "raid1", 00:11:46.961 "superblock": false, 00:11:46.961 "num_base_bdevs": 2, 00:11:46.961 "num_base_bdevs_discovered": 1, 00:11:46.961 "num_base_bdevs_operational": 1, 00:11:46.961 "base_bdevs_list": [ 00:11:46.961 { 00:11:46.961 "name": null, 00:11:46.961 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:46.961 "is_configured": false, 00:11:46.961 "data_offset": 0, 00:11:46.961 "data_size": 65536 00:11:46.961 }, 00:11:46.961 { 00:11:46.961 "name": "BaseBdev2", 00:11:46.961 "uuid": "4e21c5e7-b373-41b0-af6c-470926116a67", 00:11:46.961 "is_configured": true, 00:11:46.961 "data_offset": 0, 00:11:46.961 "data_size": 65536 00:11:46.961 } 00:11:46.961 ] 00:11:46.961 }' 00:11:46.961 10:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:46.961 10:07:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:47.531 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:47.531 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:47.531 10:07:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.531 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:47.531 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:47.531 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:47.531 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:47.791 [2024-06-10 10:07:09.559786] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:47.791 [2024-06-10 10:07:09.559861] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:47.791 [2024-06-10 10:07:09.565866] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:47.791 [2024-06-10 10:07:09.565892] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:47.791 [2024-06-10 10:07:09.565898] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xce71c0 name Existed_Raid, state offline 00:11:47.791 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:47.791 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:47.791 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.791 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 970865 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 970865 ']' 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 970865 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 970865 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 970865' 00:11:48.051 killing process with pid 970865 00:11:48.051 10:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 970865 00:11:48.051 [2024-06-10 10:07:09.825283] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:48.051 10:07:09 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 970865 00:11:48.051 [2024-06-10 10:07:09.825870] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:48.312 10:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:48.312 00:11:48.312 real 0m8.759s 00:11:48.312 user 0m15.925s 00:11:48.312 sys 0m1.311s 00:11:48.312 10:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:48.312 10:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:48.312 ************************************ 00:11:48.312 END TEST raid_state_function_test 00:11:48.312 ************************************ 00:11:48.312 10:07:09 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:11:48.312 10:07:09 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:48.312 10:07:09 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:48.312 10:07:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:48.312 ************************************ 00:11:48.312 START TEST raid_state_function_test_sb 00:11:48.312 ************************************ 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:11:48.312 10:07:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=972622 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 972622' 00:11:48.312 Process raid pid: 972622 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 972622 /var/tmp/spdk-raid.sock 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 972622 ']' 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:48.312 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:48.312 10:07:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:48.312 [2024-06-10 10:07:10.094027] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:11:48.312 [2024-06-10 10:07:10.094084] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:48.572 [2024-06-10 10:07:10.184035] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:48.572 [2024-06-10 10:07:10.248494] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:48.572 [2024-06-10 10:07:10.295863] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:48.572 [2024-06-10 10:07:10.295886] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:49.142 10:07:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:49.142 10:07:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:11:49.142 10:07:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:49.402 [2024-06-10 10:07:11.079003] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:49.402 [2024-06-10 10:07:11.079033] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:49.402 [2024-06-10 10:07:11.079039] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:49.402 [2024-06-10 10:07:11.079045] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.402 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:49.662 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.662 "name": "Existed_Raid", 00:11:49.662 "uuid": "1ef39b5f-3700-4592-9a4d-5d9ba3e045f3", 00:11:49.662 "strip_size_kb": 0, 00:11:49.662 "state": "configuring", 00:11:49.662 "raid_level": "raid1", 00:11:49.662 "superblock": 
true, 00:11:49.662 "num_base_bdevs": 2, 00:11:49.662 "num_base_bdevs_discovered": 0, 00:11:49.662 "num_base_bdevs_operational": 2, 00:11:49.662 "base_bdevs_list": [ 00:11:49.662 { 00:11:49.662 "name": "BaseBdev1", 00:11:49.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.662 "is_configured": false, 00:11:49.662 "data_offset": 0, 00:11:49.662 "data_size": 0 00:11:49.662 }, 00:11:49.662 { 00:11:49.662 "name": "BaseBdev2", 00:11:49.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.662 "is_configured": false, 00:11:49.662 "data_offset": 0, 00:11:49.662 "data_size": 0 00:11:49.662 } 00:11:49.662 ] 00:11:49.662 }' 00:11:49.662 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.662 10:07:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:50.232 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:50.232 [2024-06-10 10:07:11.977165] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:50.232 [2024-06-10 10:07:11.977180] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaabb00 name Existed_Raid, state configuring 00:11:50.232 10:07:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:50.490 [2024-06-10 10:07:12.169668] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:50.490 [2024-06-10 10:07:12.169690] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:50.491 [2024-06-10 10:07:12.169696] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:50.491 [2024-06-10 10:07:12.169701] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:50.491 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:50.750 [2024-06-10 10:07:12.364577] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:50.750 BaseBdev1 00:11:50.750 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:50.750 10:07:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:11:50.750 10:07:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:50.750 10:07:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:50.750 10:07:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:50.750 10:07:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:50.750 10:07:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:50.750 10:07:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:51.008 [ 00:11:51.008 { 00:11:51.008 "name": "BaseBdev1", 00:11:51.008 "aliases": [ 00:11:51.008 "1de73edf-e2ae-4072-9a2b-2b0b1bfca754" 00:11:51.008 ], 00:11:51.008 "product_name": "Malloc disk", 00:11:51.008 "block_size": 512, 00:11:51.008 "num_blocks": 65536, 00:11:51.008 "uuid": "1de73edf-e2ae-4072-9a2b-2b0b1bfca754", 00:11:51.008 "assigned_rate_limits": { 00:11:51.008 "rw_ios_per_sec": 0, 00:11:51.008 "rw_mbytes_per_sec": 0, 00:11:51.008 "r_mbytes_per_sec": 0, 00:11:51.008 "w_mbytes_per_sec": 0 00:11:51.008 }, 00:11:51.008 "claimed": true, 00:11:51.008 "claim_type": "exclusive_write", 00:11:51.008 "zoned": false, 00:11:51.008 "supported_io_types": { 00:11:51.008 "read": true, 00:11:51.008 "write": true, 00:11:51.008 "unmap": true, 00:11:51.008 "write_zeroes": true, 00:11:51.008 "flush": true, 00:11:51.008 "reset": true, 00:11:51.008 "compare": false, 00:11:51.008 "compare_and_write": false, 00:11:51.008 "abort": true, 00:11:51.008 "nvme_admin": false, 00:11:51.008 "nvme_io": false 00:11:51.008 }, 00:11:51.008 "memory_domains": [ 00:11:51.008 { 00:11:51.008 "dma_device_id": "system", 00:11:51.008 "dma_device_type": 1 00:11:51.008 }, 00:11:51.008 { 00:11:51.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.008 "dma_device_type": 2 00:11:51.008 } 00:11:51.008 ], 00:11:51.008 "driver_specific": {} 00:11:51.008 } 00:11:51.009 ] 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:51.009 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:51.268 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:51.268 "name": "Existed_Raid", 00:11:51.268 "uuid": "b7194385-a3bc-454a-a444-00773840441c", 00:11:51.268 "strip_size_kb": 0, 00:11:51.268 "state": "configuring", 00:11:51.268 "raid_level": "raid1", 00:11:51.268 "superblock": true, 00:11:51.268 "num_base_bdevs": 2, 00:11:51.268 "num_base_bdevs_discovered": 1, 00:11:51.268 "num_base_bdevs_operational": 2, 00:11:51.268 "base_bdevs_list": [ 00:11:51.268 { 
00:11:51.268 "name": "BaseBdev1", 00:11:51.268 "uuid": "1de73edf-e2ae-4072-9a2b-2b0b1bfca754", 00:11:51.268 "is_configured": true, 00:11:51.268 "data_offset": 2048, 00:11:51.268 "data_size": 63488 00:11:51.268 }, 00:11:51.268 { 00:11:51.268 "name": "BaseBdev2", 00:11:51.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:51.268 "is_configured": false, 00:11:51.268 "data_offset": 0, 00:11:51.268 "data_size": 0 00:11:51.268 } 00:11:51.268 ] 00:11:51.268 }' 00:11:51.268 10:07:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:51.268 10:07:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:51.839 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:51.839 [2024-06-10 10:07:13.635772] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:51.839 [2024-06-10 10:07:13.635801] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaab3f0 name Existed_Raid, state configuring 00:11:51.839 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:52.100 [2024-06-10 10:07:13.828290] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:52.100 [2024-06-10 10:07:13.829440] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:52.100 [2024-06-10 10:07:13.829466] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.100 10:07:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:52.359 10:07:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:52.359 "name": "Existed_Raid", 00:11:52.359 "uuid": "aecb95e7-6c6e-4a41-9130-0aee7d1ac492", 00:11:52.359 "strip_size_kb": 0, 00:11:52.359 "state": "configuring", 00:11:52.359 "raid_level": "raid1", 00:11:52.359 "superblock": true, 00:11:52.359 "num_base_bdevs": 2, 00:11:52.359 "num_base_bdevs_discovered": 1, 00:11:52.359 "num_base_bdevs_operational": 2, 00:11:52.359 "base_bdevs_list": [ 00:11:52.359 { 00:11:52.359 "name": "BaseBdev1", 00:11:52.359 "uuid": "1de73edf-e2ae-4072-9a2b-2b0b1bfca754", 00:11:52.359 "is_configured": true, 00:11:52.359 "data_offset": 2048, 00:11:52.359 "data_size": 63488 00:11:52.359 }, 00:11:52.359 { 00:11:52.359 "name": "BaseBdev2", 00:11:52.359 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:52.359 "is_configured": false, 00:11:52.359 "data_offset": 0, 00:11:52.360 "data_size": 0 00:11:52.360 } 00:11:52.360 ] 00:11:52.360 }' 00:11:52.360 10:07:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:52.360 10:07:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:52.931 10:07:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:52.931 [2024-06-10 10:07:14.739306] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:52.931 [2024-06-10 10:07:14.739414] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xaac1c0 00:11:52.931 [2024-06-10 10:07:14.739422] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:52.931 [2024-06-10 10:07:14.739565] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc5f220 00:11:52.931 [2024-06-10 10:07:14.739656] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaac1c0 00:11:52.931 [2024-06-10 10:07:14.739662] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xaac1c0 00:11:52.931 [2024-06-10 10:07:14.739733] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:52.931 BaseBdev2 00:11:52.931 10:07:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:52.931 10:07:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:11:52.931 10:07:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:52.931 10:07:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:52.931 10:07:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:52.931 10:07:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:52.931 10:07:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:53.192 10:07:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:53.454 [ 00:11:53.454 { 00:11:53.454 "name": "BaseBdev2", 00:11:53.454 "aliases": [ 00:11:53.454 "f2dae244-7e98-42e7-aceb-8b97f5a3a6c3" 
00:11:53.454 ], 00:11:53.454 "product_name": "Malloc disk", 00:11:53.454 "block_size": 512, 00:11:53.454 "num_blocks": 65536, 00:11:53.454 "uuid": "f2dae244-7e98-42e7-aceb-8b97f5a3a6c3", 00:11:53.454 "assigned_rate_limits": { 00:11:53.454 "rw_ios_per_sec": 0, 00:11:53.454 "rw_mbytes_per_sec": 0, 00:11:53.454 "r_mbytes_per_sec": 0, 00:11:53.454 "w_mbytes_per_sec": 0 00:11:53.454 }, 00:11:53.454 "claimed": true, 00:11:53.454 "claim_type": "exclusive_write", 00:11:53.454 "zoned": false, 00:11:53.454 "supported_io_types": { 00:11:53.454 "read": true, 00:11:53.454 "write": true, 00:11:53.454 "unmap": true, 00:11:53.454 "write_zeroes": true, 00:11:53.454 "flush": true, 00:11:53.454 "reset": true, 00:11:53.454 "compare": false, 00:11:53.454 "compare_and_write": false, 00:11:53.454 "abort": true, 00:11:53.454 "nvme_admin": false, 00:11:53.454 "nvme_io": false 00:11:53.454 }, 00:11:53.454 "memory_domains": [ 00:11:53.454 { 00:11:53.454 "dma_device_id": "system", 00:11:53.454 "dma_device_type": 1 00:11:53.454 }, 00:11:53.454 { 00:11:53.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.454 "dma_device_type": 2 00:11:53.454 } 00:11:53.454 ], 00:11:53.454 "driver_specific": {} 00:11:53.454 } 00:11:53.454 ] 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.454 "name": "Existed_Raid", 00:11:53.454 "uuid": "aecb95e7-6c6e-4a41-9130-0aee7d1ac492", 00:11:53.454 "strip_size_kb": 0, 00:11:53.454 "state": "online", 00:11:53.454 "raid_level": "raid1", 00:11:53.454 "superblock": true, 00:11:53.454 "num_base_bdevs": 2, 00:11:53.454 "num_base_bdevs_discovered": 2, 00:11:53.454 "num_base_bdevs_operational": 2, 00:11:53.454 "base_bdevs_list": [ 
00:11:53.454 { 00:11:53.454 "name": "BaseBdev1", 00:11:53.454 "uuid": "1de73edf-e2ae-4072-9a2b-2b0b1bfca754", 00:11:53.454 "is_configured": true, 00:11:53.454 "data_offset": 2048, 00:11:53.454 "data_size": 63488 00:11:53.454 }, 00:11:53.454 { 00:11:53.454 "name": "BaseBdev2", 00:11:53.454 "uuid": "f2dae244-7e98-42e7-aceb-8b97f5a3a6c3", 00:11:53.454 "is_configured": true, 00:11:53.454 "data_offset": 2048, 00:11:53.454 "data_size": 63488 00:11:53.454 } 00:11:53.454 ] 00:11:53.454 }' 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.454 10:07:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:54.025 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:54.025 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:54.025 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:54.025 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:54.025 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:54.025 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:54.025 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:54.025 10:07:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:54.286 [2024-06-10 10:07:16.030751] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:54.286 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:54.286 "name": "Existed_Raid", 00:11:54.286 "aliases": [ 00:11:54.286 "aecb95e7-6c6e-4a41-9130-0aee7d1ac492" 00:11:54.286 ], 00:11:54.286 "product_name": "Raid Volume", 00:11:54.286 "block_size": 512, 00:11:54.286 "num_blocks": 63488, 00:11:54.286 "uuid": "aecb95e7-6c6e-4a41-9130-0aee7d1ac492", 00:11:54.286 "assigned_rate_limits": { 00:11:54.286 "rw_ios_per_sec": 0, 00:11:54.286 "rw_mbytes_per_sec": 0, 00:11:54.286 "r_mbytes_per_sec": 0, 00:11:54.286 "w_mbytes_per_sec": 0 00:11:54.286 }, 00:11:54.286 "claimed": false, 00:11:54.286 "zoned": false, 00:11:54.286 "supported_io_types": { 00:11:54.286 "read": true, 00:11:54.286 "write": true, 00:11:54.286 "unmap": false, 00:11:54.286 "write_zeroes": true, 00:11:54.286 "flush": false, 00:11:54.286 "reset": true, 00:11:54.286 "compare": false, 00:11:54.286 "compare_and_write": false, 00:11:54.286 "abort": false, 00:11:54.286 "nvme_admin": false, 00:11:54.286 "nvme_io": false 00:11:54.286 }, 00:11:54.286 "memory_domains": [ 00:11:54.286 { 00:11:54.286 "dma_device_id": "system", 00:11:54.286 "dma_device_type": 1 00:11:54.286 }, 00:11:54.286 { 00:11:54.286 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:54.286 "dma_device_type": 2 00:11:54.286 }, 00:11:54.286 { 00:11:54.286 "dma_device_id": "system", 00:11:54.286 "dma_device_type": 1 00:11:54.286 }, 00:11:54.286 { 00:11:54.286 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:54.286 "dma_device_type": 2 00:11:54.286 } 00:11:54.286 ], 00:11:54.286 "driver_specific": { 00:11:54.286 "raid": { 00:11:54.286 "uuid": "aecb95e7-6c6e-4a41-9130-0aee7d1ac492", 00:11:54.286 "strip_size_kb": 0, 00:11:54.286 "state": 
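This stretch of the run is the superblock variant: the raid is created with bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid, and verify_raid_bdev_state reads bdev_raid_get_bdevs all, filtering the Existed_Raid entry with jq to confirm its state, raid level and base-bdev counts. A hedged sketch of that create-and-verify pair outside the harness might look like the following; it makes the same assumptions as the sketch above about the rpc.py path and RPC socket, assumes a fresh app with no leftover bdevs, and reuses the 32 MiB / 512-byte malloc sizes from this run.

    #!/usr/bin/env bash
    # Sketch only: build this run's superblock raid1 by hand and check its state.
    # Assumes a fresh SPDK app on /var/tmp/spdk-raid.sock with no existing bdevs.
    set -euo pipefail
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # Two 32 MiB malloc bdevs with 512-byte blocks (65536 blocks each, as above).
    "$rpc" -s "$sock" bdev_malloc_create 32 512 -b BaseBdev1
    "$rpc" -s "$sock" bdev_malloc_create 32 512 -b BaseBdev2
    # raid1 with an on-disk superblock (-s); with both members already present it
    # should configure right away and report state "online".
    "$rpc" -s "$sock" bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
           jq -r '.[] | select(.name == "Existed_Raid")')
    state=$(jq -r .state <<< "$info")
    discovered=$(jq -r .num_base_bdevs_discovered <<< "$info")
    [[ $state == online ]] && [[ $discovered == 2 ]] \
        || { echo "unexpected raid state: $state ($discovered base bdevs)" >&2; exit 1; }
    echo "Existed_Raid is $state with $discovered of 2 base bdevs"
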
"online", 00:11:54.286 "raid_level": "raid1", 00:11:54.286 "superblock": true, 00:11:54.286 "num_base_bdevs": 2, 00:11:54.286 "num_base_bdevs_discovered": 2, 00:11:54.286 "num_base_bdevs_operational": 2, 00:11:54.286 "base_bdevs_list": [ 00:11:54.286 { 00:11:54.286 "name": "BaseBdev1", 00:11:54.286 "uuid": "1de73edf-e2ae-4072-9a2b-2b0b1bfca754", 00:11:54.286 "is_configured": true, 00:11:54.286 "data_offset": 2048, 00:11:54.286 "data_size": 63488 00:11:54.286 }, 00:11:54.286 { 00:11:54.286 "name": "BaseBdev2", 00:11:54.286 "uuid": "f2dae244-7e98-42e7-aceb-8b97f5a3a6c3", 00:11:54.286 "is_configured": true, 00:11:54.286 "data_offset": 2048, 00:11:54.286 "data_size": 63488 00:11:54.286 } 00:11:54.286 ] 00:11:54.286 } 00:11:54.286 } 00:11:54.286 }' 00:11:54.286 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:54.286 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:54.286 BaseBdev2' 00:11:54.286 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:54.286 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:54.286 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:54.547 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:54.547 "name": "BaseBdev1", 00:11:54.547 "aliases": [ 00:11:54.547 "1de73edf-e2ae-4072-9a2b-2b0b1bfca754" 00:11:54.547 ], 00:11:54.547 "product_name": "Malloc disk", 00:11:54.547 "block_size": 512, 00:11:54.547 "num_blocks": 65536, 00:11:54.547 "uuid": "1de73edf-e2ae-4072-9a2b-2b0b1bfca754", 00:11:54.547 "assigned_rate_limits": { 00:11:54.547 "rw_ios_per_sec": 0, 00:11:54.547 "rw_mbytes_per_sec": 0, 00:11:54.547 "r_mbytes_per_sec": 0, 00:11:54.547 "w_mbytes_per_sec": 0 00:11:54.547 }, 00:11:54.547 "claimed": true, 00:11:54.547 "claim_type": "exclusive_write", 00:11:54.547 "zoned": false, 00:11:54.547 "supported_io_types": { 00:11:54.547 "read": true, 00:11:54.547 "write": true, 00:11:54.547 "unmap": true, 00:11:54.547 "write_zeroes": true, 00:11:54.547 "flush": true, 00:11:54.547 "reset": true, 00:11:54.547 "compare": false, 00:11:54.547 "compare_and_write": false, 00:11:54.547 "abort": true, 00:11:54.547 "nvme_admin": false, 00:11:54.547 "nvme_io": false 00:11:54.547 }, 00:11:54.547 "memory_domains": [ 00:11:54.547 { 00:11:54.547 "dma_device_id": "system", 00:11:54.547 "dma_device_type": 1 00:11:54.547 }, 00:11:54.547 { 00:11:54.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:54.547 "dma_device_type": 2 00:11:54.547 } 00:11:54.547 ], 00:11:54.547 "driver_specific": {} 00:11:54.547 }' 00:11:54.547 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:54.547 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:54.547 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:54.547 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:54.808 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:54.808 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:11:54.808 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:54.808 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:54.808 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:54.808 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:54.808 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:54.808 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:54.808 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:54.808 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:54.808 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:55.069 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:55.069 "name": "BaseBdev2", 00:11:55.069 "aliases": [ 00:11:55.069 "f2dae244-7e98-42e7-aceb-8b97f5a3a6c3" 00:11:55.069 ], 00:11:55.069 "product_name": "Malloc disk", 00:11:55.069 "block_size": 512, 00:11:55.069 "num_blocks": 65536, 00:11:55.069 "uuid": "f2dae244-7e98-42e7-aceb-8b97f5a3a6c3", 00:11:55.069 "assigned_rate_limits": { 00:11:55.069 "rw_ios_per_sec": 0, 00:11:55.069 "rw_mbytes_per_sec": 0, 00:11:55.069 "r_mbytes_per_sec": 0, 00:11:55.069 "w_mbytes_per_sec": 0 00:11:55.069 }, 00:11:55.069 "claimed": true, 00:11:55.070 "claim_type": "exclusive_write", 00:11:55.070 "zoned": false, 00:11:55.070 "supported_io_types": { 00:11:55.070 "read": true, 00:11:55.070 "write": true, 00:11:55.070 "unmap": true, 00:11:55.070 "write_zeroes": true, 00:11:55.070 "flush": true, 00:11:55.070 "reset": true, 00:11:55.070 "compare": false, 00:11:55.070 "compare_and_write": false, 00:11:55.070 "abort": true, 00:11:55.070 "nvme_admin": false, 00:11:55.070 "nvme_io": false 00:11:55.070 }, 00:11:55.070 "memory_domains": [ 00:11:55.070 { 00:11:55.070 "dma_device_id": "system", 00:11:55.070 "dma_device_type": 1 00:11:55.070 }, 00:11:55.070 { 00:11:55.070 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.070 "dma_device_type": 2 00:11:55.070 } 00:11:55.070 ], 00:11:55.070 "driver_specific": {} 00:11:55.070 }' 00:11:55.070 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:55.070 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:55.070 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:55.070 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:55.331 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:55.331 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:55.331 10:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:55.331 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:55.331 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:55.331 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:11:55.331 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:55.331 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:55.331 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:55.591 [2024-06-10 10:07:17.345987] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.591 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:55.851 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:55.851 "name": "Existed_Raid", 00:11:55.851 "uuid": "aecb95e7-6c6e-4a41-9130-0aee7d1ac492", 00:11:55.851 "strip_size_kb": 0, 00:11:55.851 "state": "online", 00:11:55.851 "raid_level": "raid1", 00:11:55.851 "superblock": true, 00:11:55.851 "num_base_bdevs": 2, 00:11:55.851 "num_base_bdevs_discovered": 1, 00:11:55.851 "num_base_bdevs_operational": 1, 00:11:55.851 "base_bdevs_list": [ 00:11:55.851 { 00:11:55.851 "name": null, 00:11:55.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.851 "is_configured": false, 00:11:55.851 "data_offset": 2048, 00:11:55.851 "data_size": 63488 00:11:55.851 }, 00:11:55.851 { 00:11:55.851 "name": "BaseBdev2", 00:11:55.851 "uuid": "f2dae244-7e98-42e7-aceb-8b97f5a3a6c3", 00:11:55.851 "is_configured": true, 00:11:55.851 "data_offset": 2048, 00:11:55.851 "data_size": 63488 00:11:55.851 } 00:11:55.851 ] 00:11:55.851 
}' 00:11:55.851 10:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:55.851 10:07:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:56.421 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:56.421 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:56.421 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.421 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:56.421 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:56.421 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:56.421 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:56.681 [2024-06-10 10:07:18.456901] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:56.681 [2024-06-10 10:07:18.456967] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:56.681 [2024-06-10 10:07:18.463031] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:56.681 [2024-06-10 10:07:18.463055] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:56.681 [2024-06-10 10:07:18.463061] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaac1c0 name Existed_Raid, state offline 00:11:56.681 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:56.681 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:56.681 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.681 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 972622 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 972622 ']' 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 972622 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 972622 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:56.941 10:07:18 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 972622' 00:11:56.941 killing process with pid 972622 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 972622 00:11:56.941 [2024-06-10 10:07:18.722317] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:56.941 10:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 972622 00:11:56.941 [2024-06-10 10:07:18.722917] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:57.201 10:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:57.201 00:11:57.202 real 0m8.817s 00:11:57.202 user 0m15.963s 00:11:57.202 sys 0m1.371s 00:11:57.202 10:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:57.202 10:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:57.202 ************************************ 00:11:57.202 END TEST raid_state_function_test_sb 00:11:57.202 ************************************ 00:11:57.202 10:07:18 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:11:57.202 10:07:18 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:11:57.202 10:07:18 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:57.202 10:07:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:57.202 ************************************ 00:11:57.202 START TEST raid_superblock_test 00:11:57.202 ************************************ 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@411 -- # raid_pid=974347 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 974347 /var/tmp/spdk-raid.sock 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 974347 ']' 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:57.202 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:57.202 10:07:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.202 [2024-06-10 10:07:18.977298] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:11:57.202 [2024-06-10 10:07:18.977348] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid974347 ] 00:11:57.462 [2024-06-10 10:07:19.067467] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:57.462 [2024-06-10 10:07:19.133510] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:57.462 [2024-06-10 10:07:19.175137] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:57.462 [2024-06-10 10:07:19.175161] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:58.033 10:07:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:58.033 10:07:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:11:58.033 10:07:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:58.033 10:07:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:58.033 10:07:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:58.033 10:07:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:58.033 10:07:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:58.033 10:07:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:58.033 10:07:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:58.033 10:07:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:58.033 10:07:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:58.293 malloc1 00:11:58.293 10:07:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:58.553 [2024-06-10 10:07:20.165113] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:58.553 [2024-06-10 10:07:20.165152] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:58.553 [2024-06-10 10:07:20.165164] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14e3990 00:11:58.553 [2024-06-10 10:07:20.165170] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:58.553 [2024-06-10 10:07:20.166520] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:58.553 [2024-06-10 10:07:20.166541] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:58.553 pt1 00:11:58.553 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:58.553 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:58.553 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:58.553 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:58.553 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:58.553 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:58.553 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:58.553 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:58.553 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:58.553 malloc2 00:11:58.553 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:58.813 [2024-06-10 10:07:20.548019] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:58.813 [2024-06-10 10:07:20.548049] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:58.813 [2024-06-10 10:07:20.548060] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14e44e0 00:11:58.813 [2024-06-10 10:07:20.548066] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:58.813 [2024-06-10 10:07:20.549274] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:58.813 [2024-06-10 10:07:20.549293] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:58.813 pt2 00:11:58.813 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:58.813 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:58.813 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:59.074 [2024-06-10 10:07:20.740512] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:59.074 [2024-06-10 10:07:20.741509] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:59.074 [2024-06-10 10:07:20.741619] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x168cbc0 00:11:59.074 [2024-06-10 10:07:20.741627] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:59.074 [2024-06-10 10:07:20.741772] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14e2850 00:11:59.074 [2024-06-10 10:07:20.741889] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x168cbc0 00:11:59.074 [2024-06-10 10:07:20.741895] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x168cbc0 00:11:59.074 [2024-06-10 10:07:20.741963] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:59.074 "name": "raid_bdev1", 00:11:59.074 "uuid": "29d77040-5fb4-4bb9-8af7-d7b81d9b0531", 00:11:59.074 "strip_size_kb": 0, 00:11:59.074 "state": "online", 00:11:59.074 "raid_level": "raid1", 00:11:59.074 "superblock": true, 00:11:59.074 "num_base_bdevs": 2, 00:11:59.074 "num_base_bdevs_discovered": 2, 00:11:59.074 "num_base_bdevs_operational": 2, 00:11:59.074 "base_bdevs_list": [ 00:11:59.074 { 00:11:59.074 "name": "pt1", 00:11:59.074 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:59.074 "is_configured": true, 00:11:59.074 "data_offset": 2048, 00:11:59.074 "data_size": 63488 00:11:59.074 }, 00:11:59.074 { 00:11:59.074 "name": "pt2", 00:11:59.074 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:59.074 "is_configured": true, 00:11:59.074 "data_offset": 2048, 00:11:59.074 "data_size": 63488 00:11:59.074 } 00:11:59.074 ] 00:11:59.074 }' 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:59.074 10:07:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:59.644 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:59.644 
10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:59.644 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:59.644 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:59.644 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:59.644 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:59.644 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:59.644 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:59.904 [2024-06-10 10:07:21.667016] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:59.904 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:59.904 "name": "raid_bdev1", 00:11:59.904 "aliases": [ 00:11:59.904 "29d77040-5fb4-4bb9-8af7-d7b81d9b0531" 00:11:59.904 ], 00:11:59.904 "product_name": "Raid Volume", 00:11:59.904 "block_size": 512, 00:11:59.904 "num_blocks": 63488, 00:11:59.904 "uuid": "29d77040-5fb4-4bb9-8af7-d7b81d9b0531", 00:11:59.904 "assigned_rate_limits": { 00:11:59.904 "rw_ios_per_sec": 0, 00:11:59.904 "rw_mbytes_per_sec": 0, 00:11:59.904 "r_mbytes_per_sec": 0, 00:11:59.904 "w_mbytes_per_sec": 0 00:11:59.904 }, 00:11:59.904 "claimed": false, 00:11:59.904 "zoned": false, 00:11:59.904 "supported_io_types": { 00:11:59.904 "read": true, 00:11:59.904 "write": true, 00:11:59.904 "unmap": false, 00:11:59.904 "write_zeroes": true, 00:11:59.904 "flush": false, 00:11:59.904 "reset": true, 00:11:59.904 "compare": false, 00:11:59.904 "compare_and_write": false, 00:11:59.904 "abort": false, 00:11:59.904 "nvme_admin": false, 00:11:59.904 "nvme_io": false 00:11:59.904 }, 00:11:59.904 "memory_domains": [ 00:11:59.904 { 00:11:59.904 "dma_device_id": "system", 00:11:59.904 "dma_device_type": 1 00:11:59.904 }, 00:11:59.904 { 00:11:59.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.904 "dma_device_type": 2 00:11:59.904 }, 00:11:59.904 { 00:11:59.904 "dma_device_id": "system", 00:11:59.904 "dma_device_type": 1 00:11:59.904 }, 00:11:59.904 { 00:11:59.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.904 "dma_device_type": 2 00:11:59.904 } 00:11:59.904 ], 00:11:59.904 "driver_specific": { 00:11:59.904 "raid": { 00:11:59.904 "uuid": "29d77040-5fb4-4bb9-8af7-d7b81d9b0531", 00:11:59.904 "strip_size_kb": 0, 00:11:59.904 "state": "online", 00:11:59.904 "raid_level": "raid1", 00:11:59.904 "superblock": true, 00:11:59.904 "num_base_bdevs": 2, 00:11:59.904 "num_base_bdevs_discovered": 2, 00:11:59.904 "num_base_bdevs_operational": 2, 00:11:59.904 "base_bdevs_list": [ 00:11:59.904 { 00:11:59.904 "name": "pt1", 00:11:59.904 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:59.904 "is_configured": true, 00:11:59.904 "data_offset": 2048, 00:11:59.904 "data_size": 63488 00:11:59.904 }, 00:11:59.904 { 00:11:59.904 "name": "pt2", 00:11:59.904 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:59.904 "is_configured": true, 00:11:59.904 "data_offset": 2048, 00:11:59.904 "data_size": 63488 00:11:59.904 } 00:11:59.904 ] 00:11:59.904 } 00:11:59.904 } 00:11:59.904 }' 00:11:59.904 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == 
true).name' 00:11:59.904 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:59.904 pt2' 00:11:59.904 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:59.904 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:59.904 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:00.164 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:00.164 "name": "pt1", 00:12:00.164 "aliases": [ 00:12:00.164 "00000000-0000-0000-0000-000000000001" 00:12:00.164 ], 00:12:00.164 "product_name": "passthru", 00:12:00.164 "block_size": 512, 00:12:00.164 "num_blocks": 65536, 00:12:00.164 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:00.164 "assigned_rate_limits": { 00:12:00.164 "rw_ios_per_sec": 0, 00:12:00.164 "rw_mbytes_per_sec": 0, 00:12:00.164 "r_mbytes_per_sec": 0, 00:12:00.164 "w_mbytes_per_sec": 0 00:12:00.164 }, 00:12:00.164 "claimed": true, 00:12:00.164 "claim_type": "exclusive_write", 00:12:00.164 "zoned": false, 00:12:00.164 "supported_io_types": { 00:12:00.164 "read": true, 00:12:00.164 "write": true, 00:12:00.164 "unmap": true, 00:12:00.164 "write_zeroes": true, 00:12:00.164 "flush": true, 00:12:00.164 "reset": true, 00:12:00.164 "compare": false, 00:12:00.164 "compare_and_write": false, 00:12:00.165 "abort": true, 00:12:00.165 "nvme_admin": false, 00:12:00.165 "nvme_io": false 00:12:00.165 }, 00:12:00.165 "memory_domains": [ 00:12:00.165 { 00:12:00.165 "dma_device_id": "system", 00:12:00.165 "dma_device_type": 1 00:12:00.165 }, 00:12:00.165 { 00:12:00.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.165 "dma_device_type": 2 00:12:00.165 } 00:12:00.165 ], 00:12:00.165 "driver_specific": { 00:12:00.165 "passthru": { 00:12:00.165 "name": "pt1", 00:12:00.165 "base_bdev_name": "malloc1" 00:12:00.165 } 00:12:00.165 } 00:12:00.165 }' 00:12:00.165 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.165 10:07:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.165 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:00.165 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.426 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.426 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:00.426 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.426 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.426 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:00.426 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.426 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.426 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:00.426 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:00.426 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
pt2 00:12:00.426 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:00.686 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:00.686 "name": "pt2", 00:12:00.686 "aliases": [ 00:12:00.686 "00000000-0000-0000-0000-000000000002" 00:12:00.686 ], 00:12:00.686 "product_name": "passthru", 00:12:00.686 "block_size": 512, 00:12:00.686 "num_blocks": 65536, 00:12:00.686 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:00.686 "assigned_rate_limits": { 00:12:00.686 "rw_ios_per_sec": 0, 00:12:00.686 "rw_mbytes_per_sec": 0, 00:12:00.686 "r_mbytes_per_sec": 0, 00:12:00.686 "w_mbytes_per_sec": 0 00:12:00.686 }, 00:12:00.686 "claimed": true, 00:12:00.686 "claim_type": "exclusive_write", 00:12:00.686 "zoned": false, 00:12:00.686 "supported_io_types": { 00:12:00.686 "read": true, 00:12:00.686 "write": true, 00:12:00.686 "unmap": true, 00:12:00.686 "write_zeroes": true, 00:12:00.686 "flush": true, 00:12:00.686 "reset": true, 00:12:00.686 "compare": false, 00:12:00.686 "compare_and_write": false, 00:12:00.686 "abort": true, 00:12:00.686 "nvme_admin": false, 00:12:00.686 "nvme_io": false 00:12:00.686 }, 00:12:00.686 "memory_domains": [ 00:12:00.686 { 00:12:00.686 "dma_device_id": "system", 00:12:00.686 "dma_device_type": 1 00:12:00.686 }, 00:12:00.686 { 00:12:00.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.686 "dma_device_type": 2 00:12:00.686 } 00:12:00.686 ], 00:12:00.686 "driver_specific": { 00:12:00.686 "passthru": { 00:12:00.686 "name": "pt2", 00:12:00.686 "base_bdev_name": "malloc2" 00:12:00.686 } 00:12:00.686 } 00:12:00.686 }' 00:12:00.686 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.687 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.687 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:00.687 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.946 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.946 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:00.946 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.946 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.946 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:00.946 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.946 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.946 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:00.946 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:00.946 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:01.206 [2024-06-10 10:07:22.966309] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:01.207 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=29d77040-5fb4-4bb9-8af7-d7b81d9b0531 00:12:01.207 10:07:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 29d77040-5fb4-4bb9-8af7-d7b81d9b0531 ']' 00:12:01.207 10:07:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:01.467 [2024-06-10 10:07:23.142586] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:01.467 [2024-06-10 10:07:23.142596] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:01.467 [2024-06-10 10:07:23.142632] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:01.467 [2024-06-10 10:07:23.142670] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:01.467 [2024-06-10 10:07:23.142676] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x168cbc0 name raid_bdev1, state offline 00:12:01.467 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.467 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:01.727 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:01.727 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:01.727 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:01.727 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:01.727 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:01.727 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:01.987 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:01.987 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 
00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:02.248 10:07:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:02.248 [2024-06-10 10:07:24.084932] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:02.248 [2024-06-10 10:07:24.085992] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:02.248 [2024-06-10 10:07:24.086032] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:02.248 [2024-06-10 10:07:24.086058] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:02.248 [2024-06-10 10:07:24.086068] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:02.248 [2024-06-10 10:07:24.086073] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x168dfa0 name raid_bdev1, state configuring 00:12:02.248 request: 00:12:02.248 { 00:12:02.248 "name": "raid_bdev1", 00:12:02.248 "raid_level": "raid1", 00:12:02.248 "base_bdevs": [ 00:12:02.248 "malloc1", 00:12:02.248 "malloc2" 00:12:02.248 ], 00:12:02.248 "superblock": false, 00:12:02.248 "method": "bdev_raid_create", 00:12:02.248 "req_id": 1 00:12:02.248 } 00:12:02.248 Got JSON-RPC error response 00:12:02.248 response: 00:12:02.248 { 00:12:02.248 "code": -17, 00:12:02.248 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:02.248 } 00:12:02.248 10:07:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:12:02.248 10:07:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:12:02.248 10:07:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:12:02.248 10:07:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:12:02.248 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.248 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:02.508 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:02.508 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:02.508 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:02.768 [2024-06-10 10:07:24.453820] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:02.768 [2024-06-10 10:07:24.453843] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:02.768 [2024-06-10 10:07:24.453852] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x168c900 00:12:02.768 [2024-06-10 10:07:24.453858] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:02.768 [2024-06-10 10:07:24.455092] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:02.768 [2024-06-10 10:07:24.455110] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:02.768 [2024-06-10 10:07:24.455151] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:02.768 [2024-06-10 10:07:24.455167] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:02.768 pt1 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.768 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:03.029 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:03.029 "name": "raid_bdev1", 00:12:03.029 "uuid": "29d77040-5fb4-4bb9-8af7-d7b81d9b0531", 00:12:03.029 "strip_size_kb": 0, 00:12:03.029 "state": "configuring", 00:12:03.029 "raid_level": "raid1", 00:12:03.029 "superblock": true, 00:12:03.029 "num_base_bdevs": 2, 00:12:03.029 "num_base_bdevs_discovered": 1, 00:12:03.029 "num_base_bdevs_operational": 2, 00:12:03.029 "base_bdevs_list": [ 00:12:03.029 { 00:12:03.029 "name": "pt1", 00:12:03.029 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:03.029 "is_configured": true, 00:12:03.029 "data_offset": 2048, 00:12:03.029 "data_size": 63488 00:12:03.029 }, 00:12:03.029 { 00:12:03.029 "name": null, 00:12:03.029 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:03.029 "is_configured": false, 00:12:03.029 "data_offset": 2048, 00:12:03.029 "data_size": 63488 00:12:03.029 } 00:12:03.029 ] 00:12:03.029 }' 00:12:03.029 10:07:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:03.029 10:07:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:03.599 10:07:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:03.599 [2024-06-10 10:07:25.384179] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:03.599 [2024-06-10 10:07:25.384205] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:03.599 [2024-06-10 10:07:25.384215] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1695190 00:12:03.599 [2024-06-10 10:07:25.384221] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:03.599 [2024-06-10 10:07:25.384471] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:03.599 [2024-06-10 10:07:25.384481] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:03.599 [2024-06-10 10:07:25.384520] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:03.599 [2024-06-10 10:07:25.384532] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:03.599 [2024-06-10 10:07:25.384602] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1694ba0 00:12:03.599 [2024-06-10 10:07:25.384608] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:03.599 [2024-06-10 10:07:25.384739] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x168cb90 00:12:03.599 [2024-06-10 10:07:25.384842] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1694ba0 00:12:03.599 [2024-06-10 10:07:25.384849] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1694ba0 00:12:03.599 [2024-06-10 10:07:25.384919] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:03.599 pt2 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:03.599 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:03.901 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:03.901 "name": "raid_bdev1", 00:12:03.901 "uuid": "29d77040-5fb4-4bb9-8af7-d7b81d9b0531", 00:12:03.901 "strip_size_kb": 0, 00:12:03.901 "state": "online", 00:12:03.901 "raid_level": "raid1", 00:12:03.901 "superblock": true, 00:12:03.901 "num_base_bdevs": 2, 00:12:03.901 "num_base_bdevs_discovered": 2, 00:12:03.901 "num_base_bdevs_operational": 2, 00:12:03.901 "base_bdevs_list": [ 00:12:03.901 { 00:12:03.901 "name": "pt1", 00:12:03.901 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:03.901 "is_configured": true, 00:12:03.901 "data_offset": 2048, 00:12:03.901 "data_size": 63488 00:12:03.901 }, 00:12:03.901 { 00:12:03.901 "name": "pt2", 00:12:03.901 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:03.901 "is_configured": true, 00:12:03.901 "data_offset": 2048, 00:12:03.901 "data_size": 63488 00:12:03.901 } 00:12:03.901 ] 00:12:03.901 }' 00:12:03.901 10:07:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:03.901 10:07:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:04.504 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:04.504 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:04.504 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:04.504 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:04.504 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:04.504 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:04.504 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:04.504 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:04.504 [2024-06-10 10:07:26.318725] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:04.504 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:04.504 "name": "raid_bdev1", 00:12:04.504 "aliases": [ 00:12:04.504 "29d77040-5fb4-4bb9-8af7-d7b81d9b0531" 00:12:04.504 ], 00:12:04.504 "product_name": "Raid Volume", 00:12:04.504 "block_size": 512, 00:12:04.504 "num_blocks": 63488, 00:12:04.504 "uuid": "29d77040-5fb4-4bb9-8af7-d7b81d9b0531", 00:12:04.504 "assigned_rate_limits": { 00:12:04.504 "rw_ios_per_sec": 0, 00:12:04.504 "rw_mbytes_per_sec": 0, 00:12:04.504 "r_mbytes_per_sec": 0, 00:12:04.504 "w_mbytes_per_sec": 0 00:12:04.504 }, 00:12:04.504 "claimed": false, 00:12:04.504 "zoned": false, 00:12:04.504 "supported_io_types": { 00:12:04.504 "read": true, 00:12:04.504 "write": true, 00:12:04.504 "unmap": false, 00:12:04.504 "write_zeroes": true, 00:12:04.504 "flush": false, 00:12:04.504 "reset": true, 00:12:04.504 "compare": false, 00:12:04.504 "compare_and_write": false, 00:12:04.504 "abort": false, 00:12:04.504 "nvme_admin": false, 00:12:04.504 "nvme_io": false 00:12:04.504 }, 00:12:04.504 "memory_domains": [ 
00:12:04.504 { 00:12:04.504 "dma_device_id": "system", 00:12:04.504 "dma_device_type": 1 00:12:04.504 }, 00:12:04.504 { 00:12:04.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.504 "dma_device_type": 2 00:12:04.504 }, 00:12:04.504 { 00:12:04.505 "dma_device_id": "system", 00:12:04.505 "dma_device_type": 1 00:12:04.505 }, 00:12:04.505 { 00:12:04.505 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.505 "dma_device_type": 2 00:12:04.505 } 00:12:04.505 ], 00:12:04.505 "driver_specific": { 00:12:04.505 "raid": { 00:12:04.505 "uuid": "29d77040-5fb4-4bb9-8af7-d7b81d9b0531", 00:12:04.505 "strip_size_kb": 0, 00:12:04.505 "state": "online", 00:12:04.505 "raid_level": "raid1", 00:12:04.505 "superblock": true, 00:12:04.505 "num_base_bdevs": 2, 00:12:04.505 "num_base_bdevs_discovered": 2, 00:12:04.505 "num_base_bdevs_operational": 2, 00:12:04.505 "base_bdevs_list": [ 00:12:04.505 { 00:12:04.505 "name": "pt1", 00:12:04.505 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:04.505 "is_configured": true, 00:12:04.505 "data_offset": 2048, 00:12:04.505 "data_size": 63488 00:12:04.505 }, 00:12:04.505 { 00:12:04.505 "name": "pt2", 00:12:04.505 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:04.505 "is_configured": true, 00:12:04.505 "data_offset": 2048, 00:12:04.505 "data_size": 63488 00:12:04.505 } 00:12:04.505 ] 00:12:04.505 } 00:12:04.505 } 00:12:04.505 }' 00:12:04.505 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:04.764 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:04.765 pt2' 00:12:04.765 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:04.765 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:04.765 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:04.765 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:04.765 "name": "pt1", 00:12:04.765 "aliases": [ 00:12:04.765 "00000000-0000-0000-0000-000000000001" 00:12:04.765 ], 00:12:04.765 "product_name": "passthru", 00:12:04.765 "block_size": 512, 00:12:04.765 "num_blocks": 65536, 00:12:04.765 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:04.765 "assigned_rate_limits": { 00:12:04.765 "rw_ios_per_sec": 0, 00:12:04.765 "rw_mbytes_per_sec": 0, 00:12:04.765 "r_mbytes_per_sec": 0, 00:12:04.765 "w_mbytes_per_sec": 0 00:12:04.765 }, 00:12:04.765 "claimed": true, 00:12:04.765 "claim_type": "exclusive_write", 00:12:04.765 "zoned": false, 00:12:04.765 "supported_io_types": { 00:12:04.765 "read": true, 00:12:04.765 "write": true, 00:12:04.765 "unmap": true, 00:12:04.765 "write_zeroes": true, 00:12:04.765 "flush": true, 00:12:04.765 "reset": true, 00:12:04.765 "compare": false, 00:12:04.765 "compare_and_write": false, 00:12:04.765 "abort": true, 00:12:04.765 "nvme_admin": false, 00:12:04.765 "nvme_io": false 00:12:04.765 }, 00:12:04.765 "memory_domains": [ 00:12:04.765 { 00:12:04.765 "dma_device_id": "system", 00:12:04.765 "dma_device_type": 1 00:12:04.765 }, 00:12:04.765 { 00:12:04.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.765 "dma_device_type": 2 00:12:04.765 } 00:12:04.765 ], 00:12:04.765 "driver_specific": { 00:12:04.765 "passthru": { 00:12:04.765 "name": "pt1", 00:12:04.765 "base_bdev_name": 
"malloc1" 00:12:04.765 } 00:12:04.765 } 00:12:04.765 }' 00:12:04.765 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.765 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:05.024 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:05.024 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:05.024 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:05.024 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:05.024 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.024 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.024 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:05.024 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:05.024 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:05.284 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:05.284 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:05.284 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:05.284 10:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:05.284 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:05.284 "name": "pt2", 00:12:05.284 "aliases": [ 00:12:05.284 "00000000-0000-0000-0000-000000000002" 00:12:05.284 ], 00:12:05.284 "product_name": "passthru", 00:12:05.284 "block_size": 512, 00:12:05.284 "num_blocks": 65536, 00:12:05.284 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:05.284 "assigned_rate_limits": { 00:12:05.284 "rw_ios_per_sec": 0, 00:12:05.284 "rw_mbytes_per_sec": 0, 00:12:05.284 "r_mbytes_per_sec": 0, 00:12:05.284 "w_mbytes_per_sec": 0 00:12:05.284 }, 00:12:05.284 "claimed": true, 00:12:05.284 "claim_type": "exclusive_write", 00:12:05.284 "zoned": false, 00:12:05.284 "supported_io_types": { 00:12:05.284 "read": true, 00:12:05.284 "write": true, 00:12:05.284 "unmap": true, 00:12:05.284 "write_zeroes": true, 00:12:05.284 "flush": true, 00:12:05.284 "reset": true, 00:12:05.284 "compare": false, 00:12:05.284 "compare_and_write": false, 00:12:05.284 "abort": true, 00:12:05.284 "nvme_admin": false, 00:12:05.284 "nvme_io": false 00:12:05.284 }, 00:12:05.284 "memory_domains": [ 00:12:05.284 { 00:12:05.284 "dma_device_id": "system", 00:12:05.284 "dma_device_type": 1 00:12:05.284 }, 00:12:05.284 { 00:12:05.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:05.284 "dma_device_type": 2 00:12:05.284 } 00:12:05.284 ], 00:12:05.284 "driver_specific": { 00:12:05.284 "passthru": { 00:12:05.284 "name": "pt2", 00:12:05.284 "base_bdev_name": "malloc2" 00:12:05.284 } 00:12:05.284 } 00:12:05.284 }' 00:12:05.284 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:05.544 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:05.544 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:05.544 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:12:05.544 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:05.544 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:05.544 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.544 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.544 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:05.544 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:05.804 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:05.804 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:05.804 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:05.804 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:05.804 [2024-06-10 10:07:27.642068] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:05.804 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 29d77040-5fb4-4bb9-8af7-d7b81d9b0531 '!=' 29d77040-5fb4-4bb9-8af7-d7b81d9b0531 ']' 00:12:05.804 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:12:05.804 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:05.804 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:05.804 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:06.065 [2024-06-10 10:07:27.834400] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.065 10:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:06.325 10:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:06.325 "name": "raid_bdev1", 
00:12:06.325 "uuid": "29d77040-5fb4-4bb9-8af7-d7b81d9b0531", 00:12:06.325 "strip_size_kb": 0, 00:12:06.325 "state": "online", 00:12:06.325 "raid_level": "raid1", 00:12:06.325 "superblock": true, 00:12:06.325 "num_base_bdevs": 2, 00:12:06.325 "num_base_bdevs_discovered": 1, 00:12:06.325 "num_base_bdevs_operational": 1, 00:12:06.325 "base_bdevs_list": [ 00:12:06.325 { 00:12:06.325 "name": null, 00:12:06.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:06.325 "is_configured": false, 00:12:06.325 "data_offset": 2048, 00:12:06.325 "data_size": 63488 00:12:06.325 }, 00:12:06.325 { 00:12:06.325 "name": "pt2", 00:12:06.325 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:06.325 "is_configured": true, 00:12:06.325 "data_offset": 2048, 00:12:06.325 "data_size": 63488 00:12:06.325 } 00:12:06.325 ] 00:12:06.325 }' 00:12:06.325 10:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:06.325 10:07:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:06.895 10:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:07.155 [2024-06-10 10:07:28.764740] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:07.155 [2024-06-10 10:07:28.764755] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:07.155 [2024-06-10 10:07:28.764789] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:07.155 [2024-06-10 10:07:28.764818] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:07.155 [2024-06-10 10:07:28.764827] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1694ba0 name raid_bdev1, state offline 00:12:07.156 10:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.156 10:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:12:07.156 10:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:12:07.156 10:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:12:07.156 10:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:12:07.156 10:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:07.156 10:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:07.416 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:12:07.416 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:07.416 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:12:07.416 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:12:07.416 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:12:07.416 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:07.676 [2024-06-10 
10:07:29.318123] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:07.676 [2024-06-10 10:07:29.318150] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:07.676 [2024-06-10 10:07:29.318159] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1691660 00:12:07.676 [2024-06-10 10:07:29.318166] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:07.676 [2024-06-10 10:07:29.319431] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:07.676 [2024-06-10 10:07:29.319450] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:07.676 [2024-06-10 10:07:29.319501] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:07.676 [2024-06-10 10:07:29.319518] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:07.676 [2024-06-10 10:07:29.319581] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x168dc10 00:12:07.676 [2024-06-10 10:07:29.319588] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:07.676 [2024-06-10 10:07:29.319726] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1692080 00:12:07.676 [2024-06-10 10:07:29.319819] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x168dc10 00:12:07.676 [2024-06-10 10:07:29.319832] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x168dc10 00:12:07.676 [2024-06-10 10:07:29.319904] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:07.676 pt2 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:07.676 "name": "raid_bdev1", 00:12:07.676 "uuid": "29d77040-5fb4-4bb9-8af7-d7b81d9b0531", 00:12:07.676 "strip_size_kb": 0, 00:12:07.676 "state": "online", 00:12:07.676 "raid_level": "raid1", 00:12:07.676 "superblock": true, 00:12:07.676 "num_base_bdevs": 2, 00:12:07.676 "num_base_bdevs_discovered": 1, 
00:12:07.676 "num_base_bdevs_operational": 1, 00:12:07.676 "base_bdevs_list": [ 00:12:07.676 { 00:12:07.676 "name": null, 00:12:07.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.676 "is_configured": false, 00:12:07.676 "data_offset": 2048, 00:12:07.676 "data_size": 63488 00:12:07.676 }, 00:12:07.676 { 00:12:07.676 "name": "pt2", 00:12:07.676 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:07.676 "is_configured": true, 00:12:07.676 "data_offset": 2048, 00:12:07.676 "data_size": 63488 00:12:07.676 } 00:12:07.676 ] 00:12:07.676 }' 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:07.676 10:07:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:08.248 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:08.508 [2024-06-10 10:07:30.196342] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:08.508 [2024-06-10 10:07:30.196360] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:08.508 [2024-06-10 10:07:30.196397] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:08.508 [2024-06-10 10:07:30.196427] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:08.508 [2024-06-10 10:07:30.196433] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x168dc10 name raid_bdev1, state offline 00:12:08.508 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.508 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:08.768 [2024-06-10 10:07:30.581301] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:08.768 [2024-06-10 10:07:30.581326] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:08.768 [2024-06-10 10:07:30.581335] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1694e20 00:12:08.768 [2024-06-10 10:07:30.581341] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:08.768 [2024-06-10 10:07:30.582603] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:08.768 [2024-06-10 10:07:30.582621] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:08.768 [2024-06-10 10:07:30.582667] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:08.768 [2024-06-10 10:07:30.582684] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:08.768 [2024-06-10 10:07:30.582755] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) 
greater than existing raid bdev raid_bdev1 (2) 00:12:08.768 [2024-06-10 10:07:30.582762] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:08.768 [2024-06-10 10:07:30.582770] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16922a0 name raid_bdev1, state configuring 00:12:08.768 [2024-06-10 10:07:30.582784] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:08.768 [2024-06-10 10:07:30.582837] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16919d0 00:12:08.768 [2024-06-10 10:07:30.582843] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:08.768 [2024-06-10 10:07:30.582977] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1692350 00:12:08.768 [2024-06-10 10:07:30.583069] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16919d0 00:12:08.768 [2024-06-10 10:07:30.583074] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16919d0 00:12:08.768 [2024-06-10 10:07:30.583147] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:08.768 pt1 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.768 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:09.028 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.028 "name": "raid_bdev1", 00:12:09.028 "uuid": "29d77040-5fb4-4bb9-8af7-d7b81d9b0531", 00:12:09.028 "strip_size_kb": 0, 00:12:09.028 "state": "online", 00:12:09.028 "raid_level": "raid1", 00:12:09.028 "superblock": true, 00:12:09.028 "num_base_bdevs": 2, 00:12:09.028 "num_base_bdevs_discovered": 1, 00:12:09.028 "num_base_bdevs_operational": 1, 00:12:09.028 "base_bdevs_list": [ 00:12:09.028 { 00:12:09.028 "name": null, 00:12:09.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:09.028 "is_configured": false, 00:12:09.028 "data_offset": 2048, 00:12:09.028 "data_size": 63488 00:12:09.028 }, 00:12:09.028 { 00:12:09.028 "name": "pt2", 00:12:09.028 "uuid": "00000000-0000-0000-0000-000000000002", 
00:12:09.028 "is_configured": true, 00:12:09.028 "data_offset": 2048, 00:12:09.028 "data_size": 63488 00:12:09.028 } 00:12:09.028 ] 00:12:09.028 }' 00:12:09.028 10:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.028 10:07:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:09.599 10:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:09.599 10:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:12:09.859 10:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:12:09.859 10:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:09.859 10:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:12:09.859 [2024-06-10 10:07:31.704305] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:09.859 10:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 29d77040-5fb4-4bb9-8af7-d7b81d9b0531 '!=' 29d77040-5fb4-4bb9-8af7-d7b81d9b0531 ']' 00:12:09.859 10:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 974347 00:12:09.859 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 974347 ']' 00:12:09.859 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 974347 00:12:10.175 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:12:10.175 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:12:10.175 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 974347 00:12:10.175 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:12:10.175 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:12:10.175 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 974347' 00:12:10.175 killing process with pid 974347 00:12:10.175 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 974347 00:12:10.175 [2024-06-10 10:07:31.774720] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:10.175 [2024-06-10 10:07:31.774755] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:10.175 [2024-06-10 10:07:31.774783] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:10.175 [2024-06-10 10:07:31.774788] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16919d0 name raid_bdev1, state offline 00:12:10.175 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 974347 00:12:10.175 [2024-06-10 10:07:31.783957] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:10.175 10:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:10.175 00:12:10.175 real 0m12.981s 00:12:10.175 user 0m24.124s 00:12:10.175 sys 0m1.914s 00:12:10.175 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # 
xtrace_disable 00:12:10.175 10:07:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.175 ************************************ 00:12:10.175 END TEST raid_superblock_test 00:12:10.175 ************************************ 00:12:10.175 10:07:31 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:12:10.175 10:07:31 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:12:10.175 10:07:31 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:12:10.175 10:07:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:10.175 ************************************ 00:12:10.175 START TEST raid_read_error_test 00:12:10.175 ************************************ 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 2 read 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:10.175 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.qpniwU5IDz 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=976856 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 976856 /var/tmp/spdk-raid.sock 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 976856 ']' 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:10.176 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:12:10.176 10:07:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.435 [2024-06-10 10:07:32.045765] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:12:10.435 [2024-06-10 10:07:32.045818] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid976856 ] 00:12:10.435 [2024-06-10 10:07:32.136038] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:10.435 [2024-06-10 10:07:32.204229] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:12:10.435 [2024-06-10 10:07:32.248182] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:10.435 [2024-06-10 10:07:32.248211] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:11.373 10:07:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:12:11.373 10:07:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:12:11.373 10:07:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:11.373 10:07:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:11.373 BaseBdev1_malloc 00:12:11.373 10:07:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:11.633 true 00:12:11.633 10:07:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:11.633 [2024-06-10 10:07:33.438698] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:11.633 [2024-06-10 10:07:33.438729] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:11.633 [2024-06-10 10:07:33.438739] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24abd10 00:12:11.633 [2024-06-10 10:07:33.438746] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:11.633 [2024-06-10 10:07:33.440095] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:11.633 [2024-06-10 10:07:33.440114] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:11.633 BaseBdev1 00:12:11.633 10:07:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:11.633 10:07:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:11.892 BaseBdev2_malloc 00:12:11.892 10:07:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:12.152 true 00:12:12.152 10:07:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:12.152 [2024-06-10 10:07:33.981930] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:12.152 [2024-06-10 10:07:33.981957] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:12.152 [2024-06-10 10:07:33.981968] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24b0710 00:12:12.152 [2024-06-10 10:07:33.981974] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:12.152 [2024-06-10 10:07:33.983152] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:12.152 [2024-06-10 10:07:33.983171] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:12.152 BaseBdev2 00:12:12.152 10:07:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:12.412 [2024-06-10 10:07:34.170422] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:12.412 [2024-06-10 10:07:34.171429] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:12.412 [2024-06-10 10:07:34.171571] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24b19f0 00:12:12.412 [2024-06-10 10:07:34.171579] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:12.412 [2024-06-10 10:07:34.171722] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24b1cd0 00:12:12.412 [2024-06-10 10:07:34.171844] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24b19f0 00:12:12.412 [2024-06-10 10:07:34.171850] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24b19f0 00:12:12.412 [2024-06-10 10:07:34.171929] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.412 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:12.672 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:12.672 "name": "raid_bdev1", 00:12:12.672 "uuid": "37b95947-cd08-416d-89b4-40bea04acdfc", 00:12:12.672 "strip_size_kb": 0, 00:12:12.672 "state": "online", 00:12:12.672 "raid_level": "raid1", 00:12:12.672 "superblock": true, 00:12:12.672 "num_base_bdevs": 2, 00:12:12.672 "num_base_bdevs_discovered": 2, 00:12:12.672 "num_base_bdevs_operational": 2, 00:12:12.672 "base_bdevs_list": [ 00:12:12.672 { 00:12:12.672 "name": "BaseBdev1", 00:12:12.672 "uuid": "d174c952-7894-52be-b8ed-789605eaede1", 00:12:12.672 "is_configured": true, 00:12:12.672 "data_offset": 2048, 00:12:12.672 "data_size": 63488 00:12:12.672 }, 00:12:12.672 { 00:12:12.672 "name": "BaseBdev2", 00:12:12.672 "uuid": "215b1873-99fe-5eed-a2e2-c6e55fec1a7f", 00:12:12.672 "is_configured": true, 00:12:12.672 "data_offset": 2048, 00:12:12.672 "data_size": 63488 00:12:12.672 } 00:12:12.672 ] 00:12:12.672 }' 00:12:12.672 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:12.672 10:07:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.241 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:13.241 10:07:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:13.241 [2024-06-10 10:07:34.972628] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24acf40 00:12:14.179 10:07:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:14.439 10:07:36 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.439 "name": "raid_bdev1", 00:12:14.439 "uuid": "37b95947-cd08-416d-89b4-40bea04acdfc", 00:12:14.439 "strip_size_kb": 0, 00:12:14.439 "state": "online", 00:12:14.439 "raid_level": "raid1", 00:12:14.439 "superblock": true, 00:12:14.439 "num_base_bdevs": 2, 00:12:14.439 "num_base_bdevs_discovered": 2, 00:12:14.439 "num_base_bdevs_operational": 2, 00:12:14.439 "base_bdevs_list": [ 00:12:14.439 { 00:12:14.439 "name": "BaseBdev1", 00:12:14.439 "uuid": "d174c952-7894-52be-b8ed-789605eaede1", 00:12:14.439 "is_configured": true, 00:12:14.439 "data_offset": 2048, 00:12:14.439 "data_size": 63488 00:12:14.439 }, 00:12:14.439 { 00:12:14.439 "name": "BaseBdev2", 00:12:14.439 "uuid": "215b1873-99fe-5eed-a2e2-c6e55fec1a7f", 00:12:14.439 "is_configured": true, 00:12:14.439 "data_offset": 2048, 00:12:14.439 "data_size": 63488 00:12:14.439 } 00:12:14.439 ] 00:12:14.439 }' 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.439 10:07:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.008 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:15.267 [2024-06-10 10:07:36.956463] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:15.267 [2024-06-10 10:07:36.956493] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:15.267 [2024-06-10 10:07:36.959104] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:15.267 [2024-06-10 10:07:36.959125] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:15.267 [2024-06-10 10:07:36.959182] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:15.267 [2024-06-10 10:07:36.959189] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24b19f0 name raid_bdev1, state offline 00:12:15.267 0 00:12:15.267 10:07:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 976856 00:12:15.267 10:07:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 976856 ']' 00:12:15.267 10:07:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 976856 00:12:15.267 10:07:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:12:15.267 10:07:36 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:12:15.267 10:07:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 976856 00:12:15.267 10:07:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:12:15.267 10:07:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:12:15.267 10:07:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 976856' 00:12:15.267 killing process with pid 976856 00:12:15.267 10:07:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 976856 00:12:15.267 [2024-06-10 10:07:37.024955] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:15.267 10:07:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 976856 00:12:15.267 [2024-06-10 10:07:37.030626] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:15.600 10:07:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.qpniwU5IDz 00:12:15.600 10:07:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:15.600 10:07:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:15.600 10:07:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:15.600 10:07:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:15.600 10:07:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:15.600 10:07:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:15.600 10:07:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:15.600 00:12:15.600 real 0m5.183s 00:12:15.600 user 0m8.125s 00:12:15.600 sys 0m0.720s 00:12:15.600 10:07:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:12:15.600 10:07:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.600 ************************************ 00:12:15.600 END TEST raid_read_error_test 00:12:15.600 ************************************ 00:12:15.600 10:07:37 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:12:15.600 10:07:37 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:12:15.600 10:07:37 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:12:15.600 10:07:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:15.600 ************************************ 00:12:15.600 START TEST raid_write_error_test 00:12:15.600 ************************************ 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 2 write 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo 
BaseBdev1 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.P85MGJARI8 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=977868 00:12:15.600 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 977868 /var/tmp/spdk-raid.sock 00:12:15.601 10:07:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:15.601 10:07:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 977868 ']' 00:12:15.601 10:07:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:15.601 10:07:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:12:15.601 10:07:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:15.601 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:15.601 10:07:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:12:15.601 10:07:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.601 [2024-06-10 10:07:37.300326] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
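For reference: raid_read_error_test above and raid_write_error_test starting here share the same scaffolding (two malloc bdevs, each wrapped in an error bdev and a passthru bdev, assembled into a raid1) and differ only in whether the injected failure hits a read or a write. A minimal manual sketch of that flow is below; it assumes a running SPDK app listening on /var/tmp/spdk-raid.sock, reuses only RPC invocations and bdev names that appear verbatim in this trace, and the $rpc shorthand is introduced here purely for brevity.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # Base devices: malloc bdev -> error bdev (EE_*) -> passthru bdev (BaseBdevN)
  $rpc bdev_malloc_create 32 512 -b BaseBdev1_malloc
  $rpc bdev_error_create BaseBdev1_malloc
  $rpc bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  $rpc bdev_malloc_create 32 512 -b BaseBdev2_malloc
  $rpc bdev_error_create BaseBdev2_malloc
  $rpc bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
  # Assemble the mirror with superblock enabled (-s)
  $rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
  # Inject read failures on the first base bdev while bdevperf drives I/O
  $rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

For the write variant the injection line becomes 'bdev_error_inject_error EE_BaseBdev1_malloc write failure'; as the trace below shows, raid1 then fails the base bdev in slot 0 and keeps the array online with a single operational base bdev, whereas the read-failure case above left both base bdevs operational.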
00:12:15.601 [2024-06-10 10:07:37.300371] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid977868 ] 00:12:15.601 [2024-06-10 10:07:37.386740] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:15.601 [2024-06-10 10:07:37.450343] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:12:15.861 [2024-06-10 10:07:37.491773] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:15.861 [2024-06-10 10:07:37.491794] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:16.429 10:07:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:12:16.429 10:07:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:12:16.429 10:07:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:16.429 10:07:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:16.689 BaseBdev1_malloc 00:12:16.689 10:07:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:16.689 true 00:12:16.689 10:07:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:16.949 [2024-06-10 10:07:38.666393] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:16.949 [2024-06-10 10:07:38.666427] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:16.949 [2024-06-10 10:07:38.666438] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2317d10 00:12:16.949 [2024-06-10 10:07:38.666445] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:16.949 [2024-06-10 10:07:38.667776] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:16.949 [2024-06-10 10:07:38.667796] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:16.949 BaseBdev1 00:12:16.949 10:07:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:16.949 10:07:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:17.208 BaseBdev2_malloc 00:12:17.208 10:07:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:17.208 true 00:12:17.208 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:17.469 [2024-06-10 10:07:39.209454] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:17.469 [2024-06-10 10:07:39.209481] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:12:17.469 [2024-06-10 10:07:39.209490] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x231c710 00:12:17.469 [2024-06-10 10:07:39.209497] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:17.469 [2024-06-10 10:07:39.210632] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:17.469 [2024-06-10 10:07:39.210650] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:17.469 BaseBdev2 00:12:17.469 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:17.731 [2024-06-10 10:07:39.389928] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:17.731 [2024-06-10 10:07:39.390915] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:17.731 [2024-06-10 10:07:39.391051] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x231d9f0 00:12:17.731 [2024-06-10 10:07:39.391059] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:17.731 [2024-06-10 10:07:39.391192] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x231dcd0 00:12:17.731 [2024-06-10 10:07:39.391304] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x231d9f0 00:12:17.731 [2024-06-10 10:07:39.391310] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x231d9f0 00:12:17.731 [2024-06-10 10:07:39.391382] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.731 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:17.992 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:17.992 "name": "raid_bdev1", 00:12:17.992 "uuid": "09be6d22-7898-4920-9904-2434e36332df", 00:12:17.992 "strip_size_kb": 0, 00:12:17.992 "state": "online", 00:12:17.992 "raid_level": "raid1", 00:12:17.992 "superblock": true, 
00:12:17.992 "num_base_bdevs": 2, 00:12:17.992 "num_base_bdevs_discovered": 2, 00:12:17.992 "num_base_bdevs_operational": 2, 00:12:17.992 "base_bdevs_list": [ 00:12:17.992 { 00:12:17.992 "name": "BaseBdev1", 00:12:17.992 "uuid": "978cb4e0-0cbf-5676-aa8b-eb35de43a416", 00:12:17.992 "is_configured": true, 00:12:17.992 "data_offset": 2048, 00:12:17.992 "data_size": 63488 00:12:17.992 }, 00:12:17.992 { 00:12:17.992 "name": "BaseBdev2", 00:12:17.992 "uuid": "5b908e7c-42bd-56db-8eeb-7b1828b44e4c", 00:12:17.992 "is_configured": true, 00:12:17.992 "data_offset": 2048, 00:12:17.992 "data_size": 63488 00:12:17.992 } 00:12:17.992 ] 00:12:17.992 }' 00:12:17.992 10:07:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:17.992 10:07:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:18.563 10:07:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:18.563 10:07:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:18.563 [2024-06-10 10:07:40.232283] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2318f40 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:19.506 [2024-06-10 10:07:41.336833] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:12:19.506 [2024-06-10 10:07:41.336881] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:19.506 [2024-06-10 10:07:41.337042] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2318f40 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.506 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:19.766 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:19.766 "name": "raid_bdev1", 00:12:19.766 "uuid": "09be6d22-7898-4920-9904-2434e36332df", 00:12:19.766 "strip_size_kb": 0, 00:12:19.766 "state": "online", 00:12:19.766 "raid_level": "raid1", 00:12:19.766 "superblock": true, 00:12:19.766 "num_base_bdevs": 2, 00:12:19.766 "num_base_bdevs_discovered": 1, 00:12:19.766 "num_base_bdevs_operational": 1, 00:12:19.766 "base_bdevs_list": [ 00:12:19.766 { 00:12:19.766 "name": null, 00:12:19.766 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:19.766 "is_configured": false, 00:12:19.766 "data_offset": 2048, 00:12:19.766 "data_size": 63488 00:12:19.766 }, 00:12:19.766 { 00:12:19.766 "name": "BaseBdev2", 00:12:19.766 "uuid": "5b908e7c-42bd-56db-8eeb-7b1828b44e4c", 00:12:19.766 "is_configured": true, 00:12:19.766 "data_offset": 2048, 00:12:19.766 "data_size": 63488 00:12:19.766 } 00:12:19.766 ] 00:12:19.766 }' 00:12:19.766 10:07:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:19.766 10:07:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.338 10:07:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:20.598 [2024-06-10 10:07:42.283053] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:20.598 [2024-06-10 10:07:42.283077] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:20.598 [2024-06-10 10:07:42.285674] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:20.598 [2024-06-10 10:07:42.285696] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:20.598 [2024-06-10 10:07:42.285734] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:20.598 [2024-06-10 10:07:42.285741] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x231d9f0 name raid_bdev1, state offline 00:12:20.598 0 00:12:20.598 10:07:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 977868 00:12:20.598 10:07:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 977868 ']' 00:12:20.598 10:07:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 977868 00:12:20.598 10:07:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:12:20.598 10:07:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:12:20.598 10:07:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 977868 00:12:20.598 10:07:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:12:20.598 10:07:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:12:20.598 10:07:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 977868' 00:12:20.598 killing process with pid 977868 00:12:20.598 10:07:42 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@968 -- # kill 977868 00:12:20.598 [2024-06-10 10:07:42.352073] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:20.598 10:07:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 977868 00:12:20.598 [2024-06-10 10:07:42.357358] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:20.858 10:07:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.P85MGJARI8 00:12:20.859 10:07:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:20.859 10:07:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:20.859 10:07:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:20.859 10:07:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:20.859 10:07:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:20.859 10:07:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:20.859 10:07:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:20.859 00:12:20.859 real 0m5.254s 00:12:20.859 user 0m8.274s 00:12:20.859 sys 0m0.710s 00:12:20.859 10:07:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:12:20.859 10:07:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.859 ************************************ 00:12:20.859 END TEST raid_write_error_test 00:12:20.859 ************************************ 00:12:20.859 10:07:42 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:12:20.859 10:07:42 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:20.859 10:07:42 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:12:20.859 10:07:42 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:12:20.859 10:07:42 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:12:20.859 10:07:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:20.859 ************************************ 00:12:20.859 START TEST raid_state_function_test 00:12:20.859 ************************************ 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 3 false 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:20.859 10:07:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=978880 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 978880' 00:12:20.859 Process raid pid: 978880 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 978880 /var/tmp/spdk-raid.sock 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 978880 ']' 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:20.859 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:12:20.859 10:07:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.859 [2024-06-10 10:07:42.625700] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
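For reference: raid_state_function_test, whose bdev_svc application is starting up here, exercises raid bdev state transitions through the RPC layer rather than through I/O. Its first check (below) is that a raid0 created on top of base bdevs that do not exist yet reports the "configuring" state, and that it stays there until all three base bdevs are registered. A minimal sketch of that check, assuming the same rpc.py path and RPC socket as this run and using only commands visible in the trace ($rpc is shorthand for the sketch only):

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # Create the array before any of its base bdevs exist; it should report "configuring"
  $rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
  # Register the first base bdev; the array should still report "configuring"
  # until BaseBdev2 and BaseBdev3 exist as well
  $rpc bdev_malloc_create 32 512 -b BaseBdev1
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'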
00:12:20.859 [2024-06-10 10:07:42.625745] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:20.859 [2024-06-10 10:07:42.713699] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:21.119 [2024-06-10 10:07:42.777474] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:12:21.119 [2024-06-10 10:07:42.817966] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:21.119 [2024-06-10 10:07:42.817988] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:21.689 10:07:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:12:21.689 10:07:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:12:21.689 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:21.950 [2024-06-10 10:07:43.633029] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:21.950 [2024-06-10 10:07:43.633057] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:21.950 [2024-06-10 10:07:43.633063] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:21.950 [2024-06-10 10:07:43.633069] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:21.950 [2024-06-10 10:07:43.633074] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:21.950 [2024-06-10 10:07:43.633083] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.950 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.211 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:12:22.211 "name": "Existed_Raid", 00:12:22.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.211 "strip_size_kb": 64, 00:12:22.211 "state": "configuring", 00:12:22.211 "raid_level": "raid0", 00:12:22.211 "superblock": false, 00:12:22.211 "num_base_bdevs": 3, 00:12:22.211 "num_base_bdevs_discovered": 0, 00:12:22.211 "num_base_bdevs_operational": 3, 00:12:22.211 "base_bdevs_list": [ 00:12:22.211 { 00:12:22.211 "name": "BaseBdev1", 00:12:22.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.211 "is_configured": false, 00:12:22.211 "data_offset": 0, 00:12:22.211 "data_size": 0 00:12:22.211 }, 00:12:22.211 { 00:12:22.211 "name": "BaseBdev2", 00:12:22.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.211 "is_configured": false, 00:12:22.211 "data_offset": 0, 00:12:22.211 "data_size": 0 00:12:22.211 }, 00:12:22.211 { 00:12:22.211 "name": "BaseBdev3", 00:12:22.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.211 "is_configured": false, 00:12:22.211 "data_offset": 0, 00:12:22.211 "data_size": 0 00:12:22.211 } 00:12:22.211 ] 00:12:22.211 }' 00:12:22.211 10:07:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.211 10:07:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.822 10:07:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:22.822 [2024-06-10 10:07:44.563266] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:22.822 [2024-06-10 10:07:44.563283] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1271b00 name Existed_Raid, state configuring 00:12:22.822 10:07:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:23.087 [2024-06-10 10:07:44.751758] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:23.087 [2024-06-10 10:07:44.751775] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:23.087 [2024-06-10 10:07:44.751780] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:23.087 [2024-06-10 10:07:44.751786] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:23.087 [2024-06-10 10:07:44.751790] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:23.087 [2024-06-10 10:07:44.751796] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:23.087 10:07:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:23.087 [2024-06-10 10:07:44.946951] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:23.087 BaseBdev1 00:12:23.348 10:07:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:23.348 10:07:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:12:23.348 10:07:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:23.348 10:07:44 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:23.348 10:07:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:23.348 10:07:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:23.348 10:07:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:23.348 10:07:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:23.609 [ 00:12:23.609 { 00:12:23.609 "name": "BaseBdev1", 00:12:23.609 "aliases": [ 00:12:23.609 "a59bd615-0fb8-45ec-8503-b0cd7d256ab4" 00:12:23.609 ], 00:12:23.609 "product_name": "Malloc disk", 00:12:23.609 "block_size": 512, 00:12:23.609 "num_blocks": 65536, 00:12:23.609 "uuid": "a59bd615-0fb8-45ec-8503-b0cd7d256ab4", 00:12:23.609 "assigned_rate_limits": { 00:12:23.609 "rw_ios_per_sec": 0, 00:12:23.609 "rw_mbytes_per_sec": 0, 00:12:23.609 "r_mbytes_per_sec": 0, 00:12:23.609 "w_mbytes_per_sec": 0 00:12:23.609 }, 00:12:23.609 "claimed": true, 00:12:23.609 "claim_type": "exclusive_write", 00:12:23.609 "zoned": false, 00:12:23.609 "supported_io_types": { 00:12:23.609 "read": true, 00:12:23.609 "write": true, 00:12:23.609 "unmap": true, 00:12:23.609 "write_zeroes": true, 00:12:23.609 "flush": true, 00:12:23.609 "reset": true, 00:12:23.609 "compare": false, 00:12:23.609 "compare_and_write": false, 00:12:23.609 "abort": true, 00:12:23.609 "nvme_admin": false, 00:12:23.609 "nvme_io": false 00:12:23.609 }, 00:12:23.609 "memory_domains": [ 00:12:23.609 { 00:12:23.609 "dma_device_id": "system", 00:12:23.609 "dma_device_type": 1 00:12:23.609 }, 00:12:23.609 { 00:12:23.609 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.609 "dma_device_type": 2 00:12:23.609 } 00:12:23.609 ], 00:12:23.609 "driver_specific": {} 00:12:23.609 } 00:12:23.609 ] 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
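For reference, the create-and-verify sequence traced here reduces to a handful of rpc.py calls against the same socket. A minimal sketch follows; note that the test adds base bdevs one at a time to exercise the "configuring" state, whereas this sketch simply creates all three up front, and the trailing ".state" projection is an illustrative addition to the jq filter used in the trace:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Three 32 MiB malloc bdevs (65536 blocks x 512 bytes) to serve as base devices
    for b in BaseBdev1 BaseBdev2 BaseBdev3; do
        $RPC bdev_malloc_create 32 512 -b "$b"
    done

    # Assemble a raid0 volume named Existed_Raid with a 64 KiB strip size
    $RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

    # Inspect it; the state should read "online" once all base bdevs are configured
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
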
00:12:23.609 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:23.869 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:23.869 "name": "Existed_Raid", 00:12:23.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:23.869 "strip_size_kb": 64, 00:12:23.869 "state": "configuring", 00:12:23.869 "raid_level": "raid0", 00:12:23.869 "superblock": false, 00:12:23.869 "num_base_bdevs": 3, 00:12:23.869 "num_base_bdevs_discovered": 1, 00:12:23.869 "num_base_bdevs_operational": 3, 00:12:23.869 "base_bdevs_list": [ 00:12:23.869 { 00:12:23.869 "name": "BaseBdev1", 00:12:23.869 "uuid": "a59bd615-0fb8-45ec-8503-b0cd7d256ab4", 00:12:23.869 "is_configured": true, 00:12:23.869 "data_offset": 0, 00:12:23.869 "data_size": 65536 00:12:23.869 }, 00:12:23.869 { 00:12:23.869 "name": "BaseBdev2", 00:12:23.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:23.869 "is_configured": false, 00:12:23.869 "data_offset": 0, 00:12:23.869 "data_size": 0 00:12:23.869 }, 00:12:23.869 { 00:12:23.869 "name": "BaseBdev3", 00:12:23.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:23.869 "is_configured": false, 00:12:23.869 "data_offset": 0, 00:12:23.869 "data_size": 0 00:12:23.869 } 00:12:23.869 ] 00:12:23.869 }' 00:12:23.869 10:07:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:23.869 10:07:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.440 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:24.440 [2024-06-10 10:07:46.226170] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:24.440 [2024-06-10 10:07:46.226195] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12713f0 name Existed_Raid, state configuring 00:12:24.440 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:24.700 [2024-06-10 10:07:46.418681] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:24.700 [2024-06-10 10:07:46.419802] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:24.700 [2024-06-10 10:07:46.419831] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:24.700 [2024-06-10 10:07:46.419837] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:24.700 [2024-06-10 10:07:46.419843] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:24.700 10:07:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.700 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:24.960 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:24.960 "name": "Existed_Raid", 00:12:24.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.960 "strip_size_kb": 64, 00:12:24.960 "state": "configuring", 00:12:24.960 "raid_level": "raid0", 00:12:24.960 "superblock": false, 00:12:24.960 "num_base_bdevs": 3, 00:12:24.960 "num_base_bdevs_discovered": 1, 00:12:24.960 "num_base_bdevs_operational": 3, 00:12:24.960 "base_bdevs_list": [ 00:12:24.960 { 00:12:24.960 "name": "BaseBdev1", 00:12:24.960 "uuid": "a59bd615-0fb8-45ec-8503-b0cd7d256ab4", 00:12:24.960 "is_configured": true, 00:12:24.960 "data_offset": 0, 00:12:24.960 "data_size": 65536 00:12:24.960 }, 00:12:24.960 { 00:12:24.960 "name": "BaseBdev2", 00:12:24.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.960 "is_configured": false, 00:12:24.960 "data_offset": 0, 00:12:24.960 "data_size": 0 00:12:24.960 }, 00:12:24.960 { 00:12:24.960 "name": "BaseBdev3", 00:12:24.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.960 "is_configured": false, 00:12:24.960 "data_offset": 0, 00:12:24.960 "data_size": 0 00:12:24.960 } 00:12:24.960 ] 00:12:24.960 }' 00:12:24.960 10:07:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:24.960 10:07:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:25.531 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:25.531 [2024-06-10 10:07:47.361945] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:25.531 BaseBdev2 00:12:25.531 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:25.531 10:07:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:12:25.531 10:07:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:25.531 10:07:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:25.531 10:07:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:25.531 10:07:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:25.531 10:07:47 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:25.791 10:07:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:26.051 [ 00:12:26.051 { 00:12:26.051 "name": "BaseBdev2", 00:12:26.051 "aliases": [ 00:12:26.051 "dde59b0c-77a5-4c13-92bd-c11ee9f7f627" 00:12:26.051 ], 00:12:26.051 "product_name": "Malloc disk", 00:12:26.051 "block_size": 512, 00:12:26.051 "num_blocks": 65536, 00:12:26.051 "uuid": "dde59b0c-77a5-4c13-92bd-c11ee9f7f627", 00:12:26.051 "assigned_rate_limits": { 00:12:26.051 "rw_ios_per_sec": 0, 00:12:26.051 "rw_mbytes_per_sec": 0, 00:12:26.051 "r_mbytes_per_sec": 0, 00:12:26.051 "w_mbytes_per_sec": 0 00:12:26.051 }, 00:12:26.051 "claimed": true, 00:12:26.051 "claim_type": "exclusive_write", 00:12:26.051 "zoned": false, 00:12:26.051 "supported_io_types": { 00:12:26.051 "read": true, 00:12:26.051 "write": true, 00:12:26.051 "unmap": true, 00:12:26.051 "write_zeroes": true, 00:12:26.051 "flush": true, 00:12:26.051 "reset": true, 00:12:26.051 "compare": false, 00:12:26.051 "compare_and_write": false, 00:12:26.051 "abort": true, 00:12:26.051 "nvme_admin": false, 00:12:26.051 "nvme_io": false 00:12:26.051 }, 00:12:26.051 "memory_domains": [ 00:12:26.051 { 00:12:26.051 "dma_device_id": "system", 00:12:26.051 "dma_device_type": 1 00:12:26.051 }, 00:12:26.051 { 00:12:26.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:26.051 "dma_device_type": 2 00:12:26.051 } 00:12:26.051 ], 00:12:26.051 "driver_specific": {} 00:12:26.051 } 00:12:26.051 ] 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.051 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:12:26.311 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.311 "name": "Existed_Raid", 00:12:26.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.311 "strip_size_kb": 64, 00:12:26.311 "state": "configuring", 00:12:26.311 "raid_level": "raid0", 00:12:26.311 "superblock": false, 00:12:26.311 "num_base_bdevs": 3, 00:12:26.311 "num_base_bdevs_discovered": 2, 00:12:26.311 "num_base_bdevs_operational": 3, 00:12:26.311 "base_bdevs_list": [ 00:12:26.311 { 00:12:26.311 "name": "BaseBdev1", 00:12:26.311 "uuid": "a59bd615-0fb8-45ec-8503-b0cd7d256ab4", 00:12:26.311 "is_configured": true, 00:12:26.311 "data_offset": 0, 00:12:26.311 "data_size": 65536 00:12:26.311 }, 00:12:26.311 { 00:12:26.311 "name": "BaseBdev2", 00:12:26.311 "uuid": "dde59b0c-77a5-4c13-92bd-c11ee9f7f627", 00:12:26.311 "is_configured": true, 00:12:26.311 "data_offset": 0, 00:12:26.311 "data_size": 65536 00:12:26.311 }, 00:12:26.311 { 00:12:26.311 "name": "BaseBdev3", 00:12:26.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.311 "is_configured": false, 00:12:26.311 "data_offset": 0, 00:12:26.311 "data_size": 0 00:12:26.311 } 00:12:26.311 ] 00:12:26.311 }' 00:12:26.311 10:07:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.311 10:07:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.881 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:26.881 [2024-06-10 10:07:48.634174] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:26.881 [2024-06-10 10:07:48.634203] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12722c0 00:12:26.881 [2024-06-10 10:07:48.634207] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:26.881 [2024-06-10 10:07:48.634350] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1415970 00:12:26.881 [2024-06-10 10:07:48.634441] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12722c0 00:12:26.881 [2024-06-10 10:07:48.634446] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12722c0 00:12:26.881 [2024-06-10 10:07:48.634562] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:26.881 BaseBdev3 00:12:26.881 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:26.881 10:07:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:12:26.881 10:07:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:26.881 10:07:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:26.881 10:07:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:26.881 10:07:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:26.881 10:07:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:27.141 10:07:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:27.141 [ 00:12:27.141 { 00:12:27.141 "name": "BaseBdev3", 00:12:27.141 "aliases": [ 00:12:27.141 "77e1bfa9-e588-4c1d-860c-799d0b1d056a" 00:12:27.141 ], 00:12:27.141 "product_name": "Malloc disk", 00:12:27.141 "block_size": 512, 00:12:27.141 "num_blocks": 65536, 00:12:27.141 "uuid": "77e1bfa9-e588-4c1d-860c-799d0b1d056a", 00:12:27.141 "assigned_rate_limits": { 00:12:27.141 "rw_ios_per_sec": 0, 00:12:27.141 "rw_mbytes_per_sec": 0, 00:12:27.141 "r_mbytes_per_sec": 0, 00:12:27.141 "w_mbytes_per_sec": 0 00:12:27.141 }, 00:12:27.141 "claimed": true, 00:12:27.141 "claim_type": "exclusive_write", 00:12:27.141 "zoned": false, 00:12:27.141 "supported_io_types": { 00:12:27.141 "read": true, 00:12:27.141 "write": true, 00:12:27.141 "unmap": true, 00:12:27.141 "write_zeroes": true, 00:12:27.141 "flush": true, 00:12:27.141 "reset": true, 00:12:27.141 "compare": false, 00:12:27.142 "compare_and_write": false, 00:12:27.142 "abort": true, 00:12:27.142 "nvme_admin": false, 00:12:27.142 "nvme_io": false 00:12:27.142 }, 00:12:27.142 "memory_domains": [ 00:12:27.142 { 00:12:27.142 "dma_device_id": "system", 00:12:27.142 "dma_device_type": 1 00:12:27.142 }, 00:12:27.142 { 00:12:27.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.142 "dma_device_type": 2 00:12:27.142 } 00:12:27.142 ], 00:12:27.142 "driver_specific": {} 00:12:27.142 } 00:12:27.142 ] 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.142 10:07:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:27.402 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:27.402 "name": "Existed_Raid", 00:12:27.402 "uuid": "a0d4430b-d9e7-4eb8-a409-059c94c185be", 00:12:27.402 "strip_size_kb": 64, 00:12:27.402 "state": "online", 
00:12:27.402 "raid_level": "raid0", 00:12:27.402 "superblock": false, 00:12:27.402 "num_base_bdevs": 3, 00:12:27.402 "num_base_bdevs_discovered": 3, 00:12:27.402 "num_base_bdevs_operational": 3, 00:12:27.402 "base_bdevs_list": [ 00:12:27.402 { 00:12:27.402 "name": "BaseBdev1", 00:12:27.402 "uuid": "a59bd615-0fb8-45ec-8503-b0cd7d256ab4", 00:12:27.402 "is_configured": true, 00:12:27.402 "data_offset": 0, 00:12:27.402 "data_size": 65536 00:12:27.402 }, 00:12:27.402 { 00:12:27.402 "name": "BaseBdev2", 00:12:27.402 "uuid": "dde59b0c-77a5-4c13-92bd-c11ee9f7f627", 00:12:27.402 "is_configured": true, 00:12:27.402 "data_offset": 0, 00:12:27.402 "data_size": 65536 00:12:27.402 }, 00:12:27.402 { 00:12:27.402 "name": "BaseBdev3", 00:12:27.402 "uuid": "77e1bfa9-e588-4c1d-860c-799d0b1d056a", 00:12:27.402 "is_configured": true, 00:12:27.402 "data_offset": 0, 00:12:27.402 "data_size": 65536 00:12:27.402 } 00:12:27.402 ] 00:12:27.402 }' 00:12:27.402 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:27.402 10:07:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:28.007 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:28.007 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:28.007 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:28.007 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:28.007 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:28.007 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:28.007 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:28.007 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:28.268 [2024-06-10 10:07:49.877528] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:28.268 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:28.268 "name": "Existed_Raid", 00:12:28.268 "aliases": [ 00:12:28.268 "a0d4430b-d9e7-4eb8-a409-059c94c185be" 00:12:28.268 ], 00:12:28.268 "product_name": "Raid Volume", 00:12:28.268 "block_size": 512, 00:12:28.268 "num_blocks": 196608, 00:12:28.268 "uuid": "a0d4430b-d9e7-4eb8-a409-059c94c185be", 00:12:28.268 "assigned_rate_limits": { 00:12:28.268 "rw_ios_per_sec": 0, 00:12:28.268 "rw_mbytes_per_sec": 0, 00:12:28.268 "r_mbytes_per_sec": 0, 00:12:28.268 "w_mbytes_per_sec": 0 00:12:28.268 }, 00:12:28.268 "claimed": false, 00:12:28.268 "zoned": false, 00:12:28.268 "supported_io_types": { 00:12:28.268 "read": true, 00:12:28.268 "write": true, 00:12:28.268 "unmap": true, 00:12:28.268 "write_zeroes": true, 00:12:28.268 "flush": true, 00:12:28.268 "reset": true, 00:12:28.268 "compare": false, 00:12:28.268 "compare_and_write": false, 00:12:28.268 "abort": false, 00:12:28.268 "nvme_admin": false, 00:12:28.268 "nvme_io": false 00:12:28.268 }, 00:12:28.268 "memory_domains": [ 00:12:28.268 { 00:12:28.268 "dma_device_id": "system", 00:12:28.268 "dma_device_type": 1 00:12:28.268 }, 00:12:28.268 { 00:12:28.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:28.268 "dma_device_type": 2 00:12:28.268 }, 
00:12:28.268 { 00:12:28.268 "dma_device_id": "system", 00:12:28.268 "dma_device_type": 1 00:12:28.268 }, 00:12:28.268 { 00:12:28.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:28.268 "dma_device_type": 2 00:12:28.268 }, 00:12:28.268 { 00:12:28.268 "dma_device_id": "system", 00:12:28.268 "dma_device_type": 1 00:12:28.268 }, 00:12:28.268 { 00:12:28.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:28.268 "dma_device_type": 2 00:12:28.268 } 00:12:28.268 ], 00:12:28.268 "driver_specific": { 00:12:28.268 "raid": { 00:12:28.268 "uuid": "a0d4430b-d9e7-4eb8-a409-059c94c185be", 00:12:28.268 "strip_size_kb": 64, 00:12:28.268 "state": "online", 00:12:28.268 "raid_level": "raid0", 00:12:28.268 "superblock": false, 00:12:28.268 "num_base_bdevs": 3, 00:12:28.268 "num_base_bdevs_discovered": 3, 00:12:28.268 "num_base_bdevs_operational": 3, 00:12:28.268 "base_bdevs_list": [ 00:12:28.268 { 00:12:28.268 "name": "BaseBdev1", 00:12:28.268 "uuid": "a59bd615-0fb8-45ec-8503-b0cd7d256ab4", 00:12:28.268 "is_configured": true, 00:12:28.268 "data_offset": 0, 00:12:28.268 "data_size": 65536 00:12:28.268 }, 00:12:28.268 { 00:12:28.268 "name": "BaseBdev2", 00:12:28.268 "uuid": "dde59b0c-77a5-4c13-92bd-c11ee9f7f627", 00:12:28.268 "is_configured": true, 00:12:28.268 "data_offset": 0, 00:12:28.268 "data_size": 65536 00:12:28.268 }, 00:12:28.268 { 00:12:28.268 "name": "BaseBdev3", 00:12:28.268 "uuid": "77e1bfa9-e588-4c1d-860c-799d0b1d056a", 00:12:28.268 "is_configured": true, 00:12:28.268 "data_offset": 0, 00:12:28.268 "data_size": 65536 00:12:28.268 } 00:12:28.268 ] 00:12:28.268 } 00:12:28.268 } 00:12:28.268 }' 00:12:28.268 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:28.268 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:28.268 BaseBdev2 00:12:28.268 BaseBdev3' 00:12:28.268 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:28.268 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:28.268 10:07:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:28.528 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:28.528 "name": "BaseBdev1", 00:12:28.528 "aliases": [ 00:12:28.528 "a59bd615-0fb8-45ec-8503-b0cd7d256ab4" 00:12:28.528 ], 00:12:28.528 "product_name": "Malloc disk", 00:12:28.528 "block_size": 512, 00:12:28.528 "num_blocks": 65536, 00:12:28.528 "uuid": "a59bd615-0fb8-45ec-8503-b0cd7d256ab4", 00:12:28.528 "assigned_rate_limits": { 00:12:28.528 "rw_ios_per_sec": 0, 00:12:28.528 "rw_mbytes_per_sec": 0, 00:12:28.528 "r_mbytes_per_sec": 0, 00:12:28.528 "w_mbytes_per_sec": 0 00:12:28.528 }, 00:12:28.528 "claimed": true, 00:12:28.528 "claim_type": "exclusive_write", 00:12:28.528 "zoned": false, 00:12:28.529 "supported_io_types": { 00:12:28.529 "read": true, 00:12:28.529 "write": true, 00:12:28.529 "unmap": true, 00:12:28.529 "write_zeroes": true, 00:12:28.529 "flush": true, 00:12:28.529 "reset": true, 00:12:28.529 "compare": false, 00:12:28.529 "compare_and_write": false, 00:12:28.529 "abort": true, 00:12:28.529 "nvme_admin": false, 00:12:28.529 "nvme_io": false 00:12:28.529 }, 00:12:28.529 "memory_domains": [ 00:12:28.529 { 00:12:28.529 "dma_device_id": "system", 
00:12:28.529 "dma_device_type": 1 00:12:28.529 }, 00:12:28.529 { 00:12:28.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:28.529 "dma_device_type": 2 00:12:28.529 } 00:12:28.529 ], 00:12:28.529 "driver_specific": {} 00:12:28.529 }' 00:12:28.529 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:28.529 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:28.529 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:28.529 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:28.529 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:28.529 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:28.529 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:28.529 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:28.788 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:28.788 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:28.788 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:28.788 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:28.788 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:28.788 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:28.788 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:29.049 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:29.049 "name": "BaseBdev2", 00:12:29.049 "aliases": [ 00:12:29.049 "dde59b0c-77a5-4c13-92bd-c11ee9f7f627" 00:12:29.049 ], 00:12:29.049 "product_name": "Malloc disk", 00:12:29.049 "block_size": 512, 00:12:29.049 "num_blocks": 65536, 00:12:29.049 "uuid": "dde59b0c-77a5-4c13-92bd-c11ee9f7f627", 00:12:29.049 "assigned_rate_limits": { 00:12:29.049 "rw_ios_per_sec": 0, 00:12:29.049 "rw_mbytes_per_sec": 0, 00:12:29.049 "r_mbytes_per_sec": 0, 00:12:29.049 "w_mbytes_per_sec": 0 00:12:29.049 }, 00:12:29.049 "claimed": true, 00:12:29.049 "claim_type": "exclusive_write", 00:12:29.049 "zoned": false, 00:12:29.049 "supported_io_types": { 00:12:29.049 "read": true, 00:12:29.049 "write": true, 00:12:29.049 "unmap": true, 00:12:29.049 "write_zeroes": true, 00:12:29.049 "flush": true, 00:12:29.049 "reset": true, 00:12:29.049 "compare": false, 00:12:29.049 "compare_and_write": false, 00:12:29.049 "abort": true, 00:12:29.049 "nvme_admin": false, 00:12:29.049 "nvme_io": false 00:12:29.049 }, 00:12:29.049 "memory_domains": [ 00:12:29.049 { 00:12:29.049 "dma_device_id": "system", 00:12:29.049 "dma_device_type": 1 00:12:29.049 }, 00:12:29.049 { 00:12:29.049 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.049 "dma_device_type": 2 00:12:29.049 } 00:12:29.049 ], 00:12:29.049 "driver_specific": {} 00:12:29.049 }' 00:12:29.049 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:29.049 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:29.049 10:07:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:29.049 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:29.049 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:29.049 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:29.049 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:29.309 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:29.309 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:29.309 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:29.309 10:07:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:29.309 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:29.309 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:29.309 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:29.309 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:29.569 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:29.569 "name": "BaseBdev3", 00:12:29.569 "aliases": [ 00:12:29.569 "77e1bfa9-e588-4c1d-860c-799d0b1d056a" 00:12:29.569 ], 00:12:29.569 "product_name": "Malloc disk", 00:12:29.569 "block_size": 512, 00:12:29.569 "num_blocks": 65536, 00:12:29.569 "uuid": "77e1bfa9-e588-4c1d-860c-799d0b1d056a", 00:12:29.569 "assigned_rate_limits": { 00:12:29.569 "rw_ios_per_sec": 0, 00:12:29.569 "rw_mbytes_per_sec": 0, 00:12:29.569 "r_mbytes_per_sec": 0, 00:12:29.569 "w_mbytes_per_sec": 0 00:12:29.569 }, 00:12:29.569 "claimed": true, 00:12:29.569 "claim_type": "exclusive_write", 00:12:29.569 "zoned": false, 00:12:29.569 "supported_io_types": { 00:12:29.569 "read": true, 00:12:29.569 "write": true, 00:12:29.569 "unmap": true, 00:12:29.569 "write_zeroes": true, 00:12:29.569 "flush": true, 00:12:29.569 "reset": true, 00:12:29.569 "compare": false, 00:12:29.569 "compare_and_write": false, 00:12:29.569 "abort": true, 00:12:29.569 "nvme_admin": false, 00:12:29.569 "nvme_io": false 00:12:29.569 }, 00:12:29.569 "memory_domains": [ 00:12:29.569 { 00:12:29.569 "dma_device_id": "system", 00:12:29.569 "dma_device_type": 1 00:12:29.569 }, 00:12:29.569 { 00:12:29.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.569 "dma_device_type": 2 00:12:29.569 } 00:12:29.569 ], 00:12:29.569 "driver_specific": {} 00:12:29.569 }' 00:12:29.569 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:29.569 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:29.569 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:29.569 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:29.569 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:29.569 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:29.569 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:12:29.828 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:29.828 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:29.828 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:29.828 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:29.828 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:29.828 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:30.088 [2024-06-10 10:07:51.730022] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:30.088 [2024-06-10 10:07:51.730039] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:30.088 [2024-06-10 10:07:51.730069] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:30.088 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.088 "name": "Existed_Raid", 00:12:30.088 "uuid": "a0d4430b-d9e7-4eb8-a409-059c94c185be", 00:12:30.088 "strip_size_kb": 64, 00:12:30.088 "state": "offline", 00:12:30.088 "raid_level": "raid0", 00:12:30.088 "superblock": false, 00:12:30.088 "num_base_bdevs": 3, 00:12:30.088 "num_base_bdevs_discovered": 2, 00:12:30.088 
"num_base_bdevs_operational": 2, 00:12:30.088 "base_bdevs_list": [ 00:12:30.088 { 00:12:30.088 "name": null, 00:12:30.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.088 "is_configured": false, 00:12:30.088 "data_offset": 0, 00:12:30.088 "data_size": 65536 00:12:30.088 }, 00:12:30.088 { 00:12:30.088 "name": "BaseBdev2", 00:12:30.089 "uuid": "dde59b0c-77a5-4c13-92bd-c11ee9f7f627", 00:12:30.089 "is_configured": true, 00:12:30.089 "data_offset": 0, 00:12:30.089 "data_size": 65536 00:12:30.089 }, 00:12:30.089 { 00:12:30.089 "name": "BaseBdev3", 00:12:30.089 "uuid": "77e1bfa9-e588-4c1d-860c-799d0b1d056a", 00:12:30.089 "is_configured": true, 00:12:30.089 "data_offset": 0, 00:12:30.089 "data_size": 65536 00:12:30.089 } 00:12:30.089 ] 00:12:30.089 }' 00:12:30.089 10:07:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.089 10:07:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:30.658 10:07:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:30.658 10:07:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:30.658 10:07:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.658 10:07:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:30.918 10:07:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:30.918 10:07:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:30.918 10:07:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:31.178 [2024-06-10 10:07:52.860905] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:31.178 10:07:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:31.178 10:07:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:31.178 10:07:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.178 10:07:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:31.438 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:31.438 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:31.438 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:31.438 [2024-06-10 10:07:53.247717] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:31.438 [2024-06-10 10:07:53.247745] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12722c0 name Existed_Raid, state offline 00:12:31.438 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:31.438 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:31.438 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.438 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:31.698 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:31.698 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:31.698 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:31.698 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:31.698 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:31.698 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:31.958 BaseBdev2 00:12:31.958 10:07:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:31.958 10:07:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:12:31.958 10:07:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:31.958 10:07:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:31.958 10:07:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:31.958 10:07:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:31.958 10:07:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:32.218 10:07:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:32.218 [ 00:12:32.218 { 00:12:32.218 "name": "BaseBdev2", 00:12:32.218 "aliases": [ 00:12:32.218 "dc2409f7-d4d3-413c-a5c6-450803830177" 00:12:32.218 ], 00:12:32.218 "product_name": "Malloc disk", 00:12:32.218 "block_size": 512, 00:12:32.218 "num_blocks": 65536, 00:12:32.218 "uuid": "dc2409f7-d4d3-413c-a5c6-450803830177", 00:12:32.218 "assigned_rate_limits": { 00:12:32.218 "rw_ios_per_sec": 0, 00:12:32.218 "rw_mbytes_per_sec": 0, 00:12:32.218 "r_mbytes_per_sec": 0, 00:12:32.218 "w_mbytes_per_sec": 0 00:12:32.218 }, 00:12:32.218 "claimed": false, 00:12:32.218 "zoned": false, 00:12:32.218 "supported_io_types": { 00:12:32.218 "read": true, 00:12:32.218 "write": true, 00:12:32.218 "unmap": true, 00:12:32.218 "write_zeroes": true, 00:12:32.218 "flush": true, 00:12:32.218 "reset": true, 00:12:32.218 "compare": false, 00:12:32.218 "compare_and_write": false, 00:12:32.218 "abort": true, 00:12:32.218 "nvme_admin": false, 00:12:32.218 "nvme_io": false 00:12:32.218 }, 00:12:32.218 "memory_domains": [ 00:12:32.218 { 00:12:32.218 "dma_device_id": "system", 00:12:32.218 "dma_device_type": 1 00:12:32.218 }, 00:12:32.218 { 00:12:32.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.218 "dma_device_type": 2 00:12:32.218 } 00:12:32.218 ], 00:12:32.218 "driver_specific": {} 00:12:32.218 } 00:12:32.218 ] 00:12:32.218 10:07:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:32.218 10:07:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:32.218 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:32.218 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:32.477 BaseBdev3 00:12:32.478 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:32.478 10:07:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:12:32.478 10:07:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:32.478 10:07:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:32.478 10:07:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:32.478 10:07:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:32.478 10:07:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:32.738 10:07:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:32.738 [ 00:12:32.738 { 00:12:32.738 "name": "BaseBdev3", 00:12:32.738 "aliases": [ 00:12:32.738 "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5" 00:12:32.738 ], 00:12:32.738 "product_name": "Malloc disk", 00:12:32.738 "block_size": 512, 00:12:32.738 "num_blocks": 65536, 00:12:32.738 "uuid": "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5", 00:12:32.738 "assigned_rate_limits": { 00:12:32.738 "rw_ios_per_sec": 0, 00:12:32.738 "rw_mbytes_per_sec": 0, 00:12:32.738 "r_mbytes_per_sec": 0, 00:12:32.738 "w_mbytes_per_sec": 0 00:12:32.738 }, 00:12:32.738 "claimed": false, 00:12:32.738 "zoned": false, 00:12:32.738 "supported_io_types": { 00:12:32.738 "read": true, 00:12:32.738 "write": true, 00:12:32.738 "unmap": true, 00:12:32.738 "write_zeroes": true, 00:12:32.738 "flush": true, 00:12:32.738 "reset": true, 00:12:32.738 "compare": false, 00:12:32.738 "compare_and_write": false, 00:12:32.738 "abort": true, 00:12:32.738 "nvme_admin": false, 00:12:32.738 "nvme_io": false 00:12:32.738 }, 00:12:32.738 "memory_domains": [ 00:12:32.738 { 00:12:32.738 "dma_device_id": "system", 00:12:32.738 "dma_device_type": 1 00:12:32.738 }, 00:12:32.738 { 00:12:32.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.738 "dma_device_type": 2 00:12:32.738 } 00:12:32.738 ], 00:12:32.738 "driver_specific": {} 00:12:32.738 } 00:12:32.738 ] 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:32.998 [2024-06-10 10:07:54.783527] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:32.998 
[2024-06-10 10:07:54.783555] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:32.998 [2024-06-10 10:07:54.783567] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:32.998 [2024-06-10 10:07:54.784600] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.998 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.258 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.258 "name": "Existed_Raid", 00:12:33.258 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.258 "strip_size_kb": 64, 00:12:33.258 "state": "configuring", 00:12:33.258 "raid_level": "raid0", 00:12:33.258 "superblock": false, 00:12:33.258 "num_base_bdevs": 3, 00:12:33.258 "num_base_bdevs_discovered": 2, 00:12:33.258 "num_base_bdevs_operational": 3, 00:12:33.258 "base_bdevs_list": [ 00:12:33.258 { 00:12:33.258 "name": "BaseBdev1", 00:12:33.258 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.258 "is_configured": false, 00:12:33.258 "data_offset": 0, 00:12:33.258 "data_size": 0 00:12:33.258 }, 00:12:33.258 { 00:12:33.258 "name": "BaseBdev2", 00:12:33.258 "uuid": "dc2409f7-d4d3-413c-a5c6-450803830177", 00:12:33.258 "is_configured": true, 00:12:33.258 "data_offset": 0, 00:12:33.258 "data_size": 65536 00:12:33.258 }, 00:12:33.258 { 00:12:33.258 "name": "BaseBdev3", 00:12:33.258 "uuid": "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5", 00:12:33.258 "is_configured": true, 00:12:33.258 "data_offset": 0, 00:12:33.258 "data_size": 65536 00:12:33.258 } 00:12:33.258 ] 00:12:33.258 }' 00:12:33.258 10:07:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.258 10:07:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:33.828 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:33.828 [2024-06-10 10:07:55.673758] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:33.828 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:33.828 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:33.828 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:33.828 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:33.828 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:33.828 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:33.828 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.828 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.828 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.828 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:34.088 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.088 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:34.088 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:34.088 "name": "Existed_Raid", 00:12:34.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.088 "strip_size_kb": 64, 00:12:34.088 "state": "configuring", 00:12:34.088 "raid_level": "raid0", 00:12:34.088 "superblock": false, 00:12:34.088 "num_base_bdevs": 3, 00:12:34.088 "num_base_bdevs_discovered": 1, 00:12:34.088 "num_base_bdevs_operational": 3, 00:12:34.088 "base_bdevs_list": [ 00:12:34.088 { 00:12:34.088 "name": "BaseBdev1", 00:12:34.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.088 "is_configured": false, 00:12:34.088 "data_offset": 0, 00:12:34.088 "data_size": 0 00:12:34.088 }, 00:12:34.088 { 00:12:34.088 "name": null, 00:12:34.088 "uuid": "dc2409f7-d4d3-413c-a5c6-450803830177", 00:12:34.088 "is_configured": false, 00:12:34.088 "data_offset": 0, 00:12:34.088 "data_size": 65536 00:12:34.088 }, 00:12:34.088 { 00:12:34.088 "name": "BaseBdev3", 00:12:34.088 "uuid": "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5", 00:12:34.088 "is_configured": true, 00:12:34.088 "data_offset": 0, 00:12:34.088 "data_size": 65536 00:12:34.088 } 00:12:34.088 ] 00:12:34.088 }' 00:12:34.088 10:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:34.088 10:07:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.658 10:07:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.658 10:07:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:34.919 10:07:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:34.919 10:07:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:35.179 [2024-06-10 10:07:56.793510] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:35.179 BaseBdev1 00:12:35.179 10:07:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:35.179 10:07:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:12:35.179 10:07:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:35.179 10:07:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:35.179 10:07:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:35.179 10:07:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:35.179 10:07:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:35.179 10:07:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:35.439 [ 00:12:35.439 { 00:12:35.439 "name": "BaseBdev1", 00:12:35.439 "aliases": [ 00:12:35.439 "ef5afc37-0aa0-4f33-be62-e681e692a954" 00:12:35.439 ], 00:12:35.439 "product_name": "Malloc disk", 00:12:35.439 "block_size": 512, 00:12:35.439 "num_blocks": 65536, 00:12:35.439 "uuid": "ef5afc37-0aa0-4f33-be62-e681e692a954", 00:12:35.439 "assigned_rate_limits": { 00:12:35.439 "rw_ios_per_sec": 0, 00:12:35.439 "rw_mbytes_per_sec": 0, 00:12:35.439 "r_mbytes_per_sec": 0, 00:12:35.439 "w_mbytes_per_sec": 0 00:12:35.439 }, 00:12:35.439 "claimed": true, 00:12:35.439 "claim_type": "exclusive_write", 00:12:35.439 "zoned": false, 00:12:35.439 "supported_io_types": { 00:12:35.439 "read": true, 00:12:35.439 "write": true, 00:12:35.439 "unmap": true, 00:12:35.439 "write_zeroes": true, 00:12:35.439 "flush": true, 00:12:35.439 "reset": true, 00:12:35.439 "compare": false, 00:12:35.439 "compare_and_write": false, 00:12:35.439 "abort": true, 00:12:35.439 "nvme_admin": false, 00:12:35.439 "nvme_io": false 00:12:35.439 }, 00:12:35.439 "memory_domains": [ 00:12:35.439 { 00:12:35.439 "dma_device_id": "system", 00:12:35.439 "dma_device_type": 1 00:12:35.439 }, 00:12:35.439 { 00:12:35.439 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.439 "dma_device_type": 2 00:12:35.439 } 00:12:35.439 ], 00:12:35.439 "driver_specific": {} 00:12:35.439 } 00:12:35.439 ] 00:12:35.439 10:07:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:35.440 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:35.440 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:35.440 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:35.440 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:35.440 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:35.440 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:35.440 10:07:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:35.440 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:35.440 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:35.440 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:35.440 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.440 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:35.699 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:35.699 "name": "Existed_Raid", 00:12:35.699 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.699 "strip_size_kb": 64, 00:12:35.699 "state": "configuring", 00:12:35.699 "raid_level": "raid0", 00:12:35.699 "superblock": false, 00:12:35.699 "num_base_bdevs": 3, 00:12:35.699 "num_base_bdevs_discovered": 2, 00:12:35.699 "num_base_bdevs_operational": 3, 00:12:35.699 "base_bdevs_list": [ 00:12:35.699 { 00:12:35.699 "name": "BaseBdev1", 00:12:35.699 "uuid": "ef5afc37-0aa0-4f33-be62-e681e692a954", 00:12:35.699 "is_configured": true, 00:12:35.699 "data_offset": 0, 00:12:35.699 "data_size": 65536 00:12:35.699 }, 00:12:35.699 { 00:12:35.699 "name": null, 00:12:35.699 "uuid": "dc2409f7-d4d3-413c-a5c6-450803830177", 00:12:35.699 "is_configured": false, 00:12:35.699 "data_offset": 0, 00:12:35.699 "data_size": 65536 00:12:35.699 }, 00:12:35.699 { 00:12:35.699 "name": "BaseBdev3", 00:12:35.699 "uuid": "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5", 00:12:35.699 "is_configured": true, 00:12:35.699 "data_offset": 0, 00:12:35.700 "data_size": 65536 00:12:35.700 } 00:12:35.700 ] 00:12:35.700 }' 00:12:35.700 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:35.700 10:07:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.269 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.269 10:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:36.269 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:36.269 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:36.529 [2024-06-10 10:07:58.265261] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:36.529 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:36.529 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:36.529 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:36.529 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:36.529 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:36.529 10:07:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:36.529 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.529 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.529 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.529 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.529 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.529 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:36.789 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.789 "name": "Existed_Raid", 00:12:36.789 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.789 "strip_size_kb": 64, 00:12:36.789 "state": "configuring", 00:12:36.789 "raid_level": "raid0", 00:12:36.789 "superblock": false, 00:12:36.789 "num_base_bdevs": 3, 00:12:36.789 "num_base_bdevs_discovered": 1, 00:12:36.789 "num_base_bdevs_operational": 3, 00:12:36.789 "base_bdevs_list": [ 00:12:36.789 { 00:12:36.789 "name": "BaseBdev1", 00:12:36.789 "uuid": "ef5afc37-0aa0-4f33-be62-e681e692a954", 00:12:36.789 "is_configured": true, 00:12:36.789 "data_offset": 0, 00:12:36.789 "data_size": 65536 00:12:36.789 }, 00:12:36.789 { 00:12:36.789 "name": null, 00:12:36.789 "uuid": "dc2409f7-d4d3-413c-a5c6-450803830177", 00:12:36.789 "is_configured": false, 00:12:36.789 "data_offset": 0, 00:12:36.789 "data_size": 65536 00:12:36.789 }, 00:12:36.789 { 00:12:36.789 "name": null, 00:12:36.789 "uuid": "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5", 00:12:36.789 "is_configured": false, 00:12:36.789 "data_offset": 0, 00:12:36.789 "data_size": 65536 00:12:36.789 } 00:12:36.789 ] 00:12:36.789 }' 00:12:36.789 10:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.789 10:07:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:37.359 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:37.359 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.359 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:37.359 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:37.619 [2024-06-10 10:07:59.348008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
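The trace above removes BaseBdev3 with bdev_raid_remove_base_bdev and then re-reads the raid bdev, expecting it to stay in the "configuring" state. A minimal sketch of that remove-and-verify pattern, condensed from the verify_raid_bdev_state helper traced here (the $rpc shortcut variable and the one-line state check are illustrative, not part of the harness):

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # Drop one base bdev from the raid under test.
  $rpc bdev_raid_remove_base_bdev BaseBdev3
  # Re-read Existed_Raid and confirm it still reports "configuring".
  state=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state')
  [ "$state" = "configuring" ] || echo "unexpected state: $state"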
00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.619 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:37.879 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:37.879 "name": "Existed_Raid", 00:12:37.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.879 "strip_size_kb": 64, 00:12:37.879 "state": "configuring", 00:12:37.879 "raid_level": "raid0", 00:12:37.879 "superblock": false, 00:12:37.879 "num_base_bdevs": 3, 00:12:37.879 "num_base_bdevs_discovered": 2, 00:12:37.879 "num_base_bdevs_operational": 3, 00:12:37.879 "base_bdevs_list": [ 00:12:37.879 { 00:12:37.879 "name": "BaseBdev1", 00:12:37.879 "uuid": "ef5afc37-0aa0-4f33-be62-e681e692a954", 00:12:37.879 "is_configured": true, 00:12:37.879 "data_offset": 0, 00:12:37.879 "data_size": 65536 00:12:37.879 }, 00:12:37.879 { 00:12:37.879 "name": null, 00:12:37.879 "uuid": "dc2409f7-d4d3-413c-a5c6-450803830177", 00:12:37.879 "is_configured": false, 00:12:37.879 "data_offset": 0, 00:12:37.879 "data_size": 65536 00:12:37.879 }, 00:12:37.879 { 00:12:37.879 "name": "BaseBdev3", 00:12:37.879 "uuid": "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5", 00:12:37.879 "is_configured": true, 00:12:37.879 "data_offset": 0, 00:12:37.879 "data_size": 65536 00:12:37.879 } 00:12:37.879 ] 00:12:37.879 }' 00:12:37.879 10:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:37.879 10:07:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:38.454 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.454 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:38.454 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:38.454 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:38.714 [2024-06-10 10:08:00.462837] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:38.714 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:38.714 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:38.714 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:38.714 
10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:38.714 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:38.714 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:38.714 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:38.714 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:38.714 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:38.714 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:38.714 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.714 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:38.975 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.975 "name": "Existed_Raid", 00:12:38.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.975 "strip_size_kb": 64, 00:12:38.975 "state": "configuring", 00:12:38.975 "raid_level": "raid0", 00:12:38.975 "superblock": false, 00:12:38.975 "num_base_bdevs": 3, 00:12:38.975 "num_base_bdevs_discovered": 1, 00:12:38.975 "num_base_bdevs_operational": 3, 00:12:38.975 "base_bdevs_list": [ 00:12:38.975 { 00:12:38.975 "name": null, 00:12:38.975 "uuid": "ef5afc37-0aa0-4f33-be62-e681e692a954", 00:12:38.975 "is_configured": false, 00:12:38.975 "data_offset": 0, 00:12:38.975 "data_size": 65536 00:12:38.975 }, 00:12:38.975 { 00:12:38.975 "name": null, 00:12:38.975 "uuid": "dc2409f7-d4d3-413c-a5c6-450803830177", 00:12:38.975 "is_configured": false, 00:12:38.975 "data_offset": 0, 00:12:38.975 "data_size": 65536 00:12:38.975 }, 00:12:38.975 { 00:12:38.975 "name": "BaseBdev3", 00:12:38.975 "uuid": "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5", 00:12:38.975 "is_configured": true, 00:12:38.975 "data_offset": 0, 00:12:38.975 "data_size": 65536 00:12:38.975 } 00:12:38.975 ] 00:12:38.975 }' 00:12:38.975 10:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.975 10:08:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.546 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.546 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:39.807 [2024-06-10 10:08:01.587511] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
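Deleting the malloc disk behind a slot (bdev_malloc_delete BaseBdev1 above) leaves that slot present in base_bdevs_list but with a null name and is_configured false. The slot-level check the script performs can be reproduced roughly as below; a sketch only, and the expected value depends on which slot was just removed:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # Inspect a single slot of the first raid bdev's base_bdevs_list.
  configured=$($rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[0].is_configured')
  # After bdev_malloc_delete BaseBdev1 the first slot should report false.
  [ "$configured" = "false" ] && echo "slot 0 is unconfigured"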
00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.807 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:40.067 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:40.067 "name": "Existed_Raid", 00:12:40.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:40.067 "strip_size_kb": 64, 00:12:40.067 "state": "configuring", 00:12:40.067 "raid_level": "raid0", 00:12:40.067 "superblock": false, 00:12:40.067 "num_base_bdevs": 3, 00:12:40.067 "num_base_bdevs_discovered": 2, 00:12:40.067 "num_base_bdevs_operational": 3, 00:12:40.067 "base_bdevs_list": [ 00:12:40.067 { 00:12:40.067 "name": null, 00:12:40.067 "uuid": "ef5afc37-0aa0-4f33-be62-e681e692a954", 00:12:40.067 "is_configured": false, 00:12:40.067 "data_offset": 0, 00:12:40.067 "data_size": 65536 00:12:40.067 }, 00:12:40.067 { 00:12:40.067 "name": "BaseBdev2", 00:12:40.067 "uuid": "dc2409f7-d4d3-413c-a5c6-450803830177", 00:12:40.067 "is_configured": true, 00:12:40.067 "data_offset": 0, 00:12:40.067 "data_size": 65536 00:12:40.067 }, 00:12:40.067 { 00:12:40.067 "name": "BaseBdev3", 00:12:40.067 "uuid": "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5", 00:12:40.067 "is_configured": true, 00:12:40.067 "data_offset": 0, 00:12:40.067 "data_size": 65536 00:12:40.067 } 00:12:40.067 ] 00:12:40.067 }' 00:12:40.067 10:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:40.067 10:08:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:40.636 10:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.636 10:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:40.895 10:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:40.895 10:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.895 10:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:40.895 10:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ef5afc37-0aa0-4f33-be62-e681e692a954 00:12:41.156 [2024-06-10 10:08:02.879762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:41.156 [2024-06-10 10:08:02.879786] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12726b0 00:12:41.156 [2024-06-10 10:08:02.879790] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:41.156 [2024-06-10 10:08:02.879941] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14165c0 00:12:41.156 [2024-06-10 10:08:02.880030] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12726b0 00:12:41.156 [2024-06-10 10:08:02.880035] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12726b0 00:12:41.156 [2024-06-10 10:08:02.880152] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:41.156 NewBaseBdev 00:12:41.156 10:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:41.156 10:08:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:12:41.156 10:08:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:41.156 10:08:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:41.156 10:08:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:41.156 10:08:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:41.156 10:08:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:41.417 [ 00:12:41.417 { 00:12:41.417 "name": "NewBaseBdev", 00:12:41.417 "aliases": [ 00:12:41.417 "ef5afc37-0aa0-4f33-be62-e681e692a954" 00:12:41.417 ], 00:12:41.417 "product_name": "Malloc disk", 00:12:41.417 "block_size": 512, 00:12:41.417 "num_blocks": 65536, 00:12:41.417 "uuid": "ef5afc37-0aa0-4f33-be62-e681e692a954", 00:12:41.417 "assigned_rate_limits": { 00:12:41.417 "rw_ios_per_sec": 0, 00:12:41.417 "rw_mbytes_per_sec": 0, 00:12:41.417 "r_mbytes_per_sec": 0, 00:12:41.417 "w_mbytes_per_sec": 0 00:12:41.417 }, 00:12:41.417 "claimed": true, 00:12:41.417 "claim_type": "exclusive_write", 00:12:41.417 "zoned": false, 00:12:41.417 "supported_io_types": { 00:12:41.417 "read": true, 00:12:41.417 "write": true, 00:12:41.417 "unmap": true, 00:12:41.417 "write_zeroes": true, 00:12:41.417 "flush": true, 00:12:41.417 "reset": true, 00:12:41.417 "compare": false, 00:12:41.417 "compare_and_write": false, 00:12:41.417 "abort": true, 00:12:41.417 "nvme_admin": false, 00:12:41.417 "nvme_io": false 00:12:41.417 }, 00:12:41.417 "memory_domains": [ 00:12:41.417 { 00:12:41.417 "dma_device_id": "system", 00:12:41.417 "dma_device_type": 1 00:12:41.417 }, 00:12:41.417 { 00:12:41.417 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.417 "dma_device_type": 2 00:12:41.417 } 00:12:41.417 ], 00:12:41.417 "driver_specific": {} 00:12:41.417 } 00:12:41.417 ] 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # return 0 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.417 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:41.677 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.677 "name": "Existed_Raid", 00:12:41.677 "uuid": "5f1c34fa-fd6d-4c4d-9084-eac4af5bbe1d", 00:12:41.677 "strip_size_kb": 64, 00:12:41.677 "state": "online", 00:12:41.677 "raid_level": "raid0", 00:12:41.677 "superblock": false, 00:12:41.677 "num_base_bdevs": 3, 00:12:41.677 "num_base_bdevs_discovered": 3, 00:12:41.677 "num_base_bdevs_operational": 3, 00:12:41.677 "base_bdevs_list": [ 00:12:41.677 { 00:12:41.677 "name": "NewBaseBdev", 00:12:41.677 "uuid": "ef5afc37-0aa0-4f33-be62-e681e692a954", 00:12:41.677 "is_configured": true, 00:12:41.677 "data_offset": 0, 00:12:41.677 "data_size": 65536 00:12:41.677 }, 00:12:41.677 { 00:12:41.677 "name": "BaseBdev2", 00:12:41.677 "uuid": "dc2409f7-d4d3-413c-a5c6-450803830177", 00:12:41.677 "is_configured": true, 00:12:41.677 "data_offset": 0, 00:12:41.677 "data_size": 65536 00:12:41.677 }, 00:12:41.677 { 00:12:41.677 "name": "BaseBdev3", 00:12:41.677 "uuid": "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5", 00:12:41.677 "is_configured": true, 00:12:41.677 "data_offset": 0, 00:12:41.677 "data_size": 65536 00:12:41.677 } 00:12:41.677 ] 00:12:41.677 }' 00:12:41.677 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.677 10:08:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:42.277 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:42.277 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:42.277 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:42.277 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:42.277 10:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:42.277 10:08:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:42.277 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:42.277 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:42.562 [2024-06-10 10:08:04.179261] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:42.562 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:42.562 "name": "Existed_Raid", 00:12:42.562 "aliases": [ 00:12:42.562 "5f1c34fa-fd6d-4c4d-9084-eac4af5bbe1d" 00:12:42.562 ], 00:12:42.562 "product_name": "Raid Volume", 00:12:42.562 "block_size": 512, 00:12:42.562 "num_blocks": 196608, 00:12:42.562 "uuid": "5f1c34fa-fd6d-4c4d-9084-eac4af5bbe1d", 00:12:42.562 "assigned_rate_limits": { 00:12:42.562 "rw_ios_per_sec": 0, 00:12:42.562 "rw_mbytes_per_sec": 0, 00:12:42.562 "r_mbytes_per_sec": 0, 00:12:42.562 "w_mbytes_per_sec": 0 00:12:42.562 }, 00:12:42.562 "claimed": false, 00:12:42.562 "zoned": false, 00:12:42.562 "supported_io_types": { 00:12:42.562 "read": true, 00:12:42.562 "write": true, 00:12:42.562 "unmap": true, 00:12:42.562 "write_zeroes": true, 00:12:42.562 "flush": true, 00:12:42.562 "reset": true, 00:12:42.562 "compare": false, 00:12:42.562 "compare_and_write": false, 00:12:42.562 "abort": false, 00:12:42.562 "nvme_admin": false, 00:12:42.562 "nvme_io": false 00:12:42.562 }, 00:12:42.562 "memory_domains": [ 00:12:42.562 { 00:12:42.562 "dma_device_id": "system", 00:12:42.562 "dma_device_type": 1 00:12:42.562 }, 00:12:42.562 { 00:12:42.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.562 "dma_device_type": 2 00:12:42.562 }, 00:12:42.562 { 00:12:42.562 "dma_device_id": "system", 00:12:42.562 "dma_device_type": 1 00:12:42.562 }, 00:12:42.562 { 00:12:42.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.562 "dma_device_type": 2 00:12:42.562 }, 00:12:42.563 { 00:12:42.563 "dma_device_id": "system", 00:12:42.563 "dma_device_type": 1 00:12:42.563 }, 00:12:42.563 { 00:12:42.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.563 "dma_device_type": 2 00:12:42.563 } 00:12:42.563 ], 00:12:42.563 "driver_specific": { 00:12:42.563 "raid": { 00:12:42.563 "uuid": "5f1c34fa-fd6d-4c4d-9084-eac4af5bbe1d", 00:12:42.563 "strip_size_kb": 64, 00:12:42.563 "state": "online", 00:12:42.563 "raid_level": "raid0", 00:12:42.563 "superblock": false, 00:12:42.563 "num_base_bdevs": 3, 00:12:42.563 "num_base_bdevs_discovered": 3, 00:12:42.563 "num_base_bdevs_operational": 3, 00:12:42.563 "base_bdevs_list": [ 00:12:42.563 { 00:12:42.563 "name": "NewBaseBdev", 00:12:42.563 "uuid": "ef5afc37-0aa0-4f33-be62-e681e692a954", 00:12:42.563 "is_configured": true, 00:12:42.563 "data_offset": 0, 00:12:42.563 "data_size": 65536 00:12:42.563 }, 00:12:42.563 { 00:12:42.563 "name": "BaseBdev2", 00:12:42.563 "uuid": "dc2409f7-d4d3-413c-a5c6-450803830177", 00:12:42.563 "is_configured": true, 00:12:42.563 "data_offset": 0, 00:12:42.563 "data_size": 65536 00:12:42.563 }, 00:12:42.563 { 00:12:42.563 "name": "BaseBdev3", 00:12:42.563 "uuid": "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5", 00:12:42.563 "is_configured": true, 00:12:42.563 "data_offset": 0, 00:12:42.563 "data_size": 65536 00:12:42.563 } 00:12:42.563 ] 00:12:42.563 } 00:12:42.563 } 00:12:42.563 }' 00:12:42.563 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:42.563 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:42.563 BaseBdev2 00:12:42.563 BaseBdev3' 00:12:42.563 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:42.563 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:42.563 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:42.823 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:42.823 "name": "NewBaseBdev", 00:12:42.823 "aliases": [ 00:12:42.823 "ef5afc37-0aa0-4f33-be62-e681e692a954" 00:12:42.823 ], 00:12:42.823 "product_name": "Malloc disk", 00:12:42.823 "block_size": 512, 00:12:42.823 "num_blocks": 65536, 00:12:42.823 "uuid": "ef5afc37-0aa0-4f33-be62-e681e692a954", 00:12:42.823 "assigned_rate_limits": { 00:12:42.823 "rw_ios_per_sec": 0, 00:12:42.823 "rw_mbytes_per_sec": 0, 00:12:42.823 "r_mbytes_per_sec": 0, 00:12:42.823 "w_mbytes_per_sec": 0 00:12:42.823 }, 00:12:42.823 "claimed": true, 00:12:42.823 "claim_type": "exclusive_write", 00:12:42.823 "zoned": false, 00:12:42.823 "supported_io_types": { 00:12:42.823 "read": true, 00:12:42.823 "write": true, 00:12:42.823 "unmap": true, 00:12:42.823 "write_zeroes": true, 00:12:42.823 "flush": true, 00:12:42.823 "reset": true, 00:12:42.823 "compare": false, 00:12:42.823 "compare_and_write": false, 00:12:42.823 "abort": true, 00:12:42.823 "nvme_admin": false, 00:12:42.823 "nvme_io": false 00:12:42.823 }, 00:12:42.823 "memory_domains": [ 00:12:42.823 { 00:12:42.823 "dma_device_id": "system", 00:12:42.823 "dma_device_type": 1 00:12:42.823 }, 00:12:42.823 { 00:12:42.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.823 "dma_device_type": 2 00:12:42.823 } 00:12:42.823 ], 00:12:42.823 "driver_specific": {} 00:12:42.823 }' 00:12:42.823 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.823 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.823 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:42.823 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.823 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.823 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:42.823 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:42.823 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.083 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:43.083 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.083 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.083 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:43.083 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:43.083 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:43.083 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:43.343 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:43.343 "name": "BaseBdev2", 00:12:43.343 "aliases": [ 00:12:43.343 "dc2409f7-d4d3-413c-a5c6-450803830177" 00:12:43.343 ], 00:12:43.343 "product_name": "Malloc disk", 00:12:43.343 "block_size": 512, 00:12:43.343 "num_blocks": 65536, 00:12:43.343 "uuid": "dc2409f7-d4d3-413c-a5c6-450803830177", 00:12:43.343 "assigned_rate_limits": { 00:12:43.343 "rw_ios_per_sec": 0, 00:12:43.343 "rw_mbytes_per_sec": 0, 00:12:43.343 "r_mbytes_per_sec": 0, 00:12:43.343 "w_mbytes_per_sec": 0 00:12:43.343 }, 00:12:43.343 "claimed": true, 00:12:43.343 "claim_type": "exclusive_write", 00:12:43.343 "zoned": false, 00:12:43.343 "supported_io_types": { 00:12:43.343 "read": true, 00:12:43.343 "write": true, 00:12:43.343 "unmap": true, 00:12:43.343 "write_zeroes": true, 00:12:43.343 "flush": true, 00:12:43.343 "reset": true, 00:12:43.343 "compare": false, 00:12:43.343 "compare_and_write": false, 00:12:43.343 "abort": true, 00:12:43.343 "nvme_admin": false, 00:12:43.343 "nvme_io": false 00:12:43.343 }, 00:12:43.343 "memory_domains": [ 00:12:43.343 { 00:12:43.343 "dma_device_id": "system", 00:12:43.343 "dma_device_type": 1 00:12:43.343 }, 00:12:43.343 { 00:12:43.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.343 "dma_device_type": 2 00:12:43.343 } 00:12:43.343 ], 00:12:43.343 "driver_specific": {} 00:12:43.343 }' 00:12:43.343 10:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.343 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.343 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:43.343 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.343 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.343 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:43.343 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.604 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.604 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:43.604 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.604 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.604 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:43.604 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:43.604 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:43.604 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:43.864 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:43.864 "name": "BaseBdev3", 00:12:43.864 "aliases": [ 00:12:43.864 "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5" 00:12:43.864 ], 00:12:43.864 "product_name": 
"Malloc disk", 00:12:43.864 "block_size": 512, 00:12:43.864 "num_blocks": 65536, 00:12:43.864 "uuid": "17478ca6-f5d8-4c1b-a435-4fdfe91d3ab5", 00:12:43.864 "assigned_rate_limits": { 00:12:43.864 "rw_ios_per_sec": 0, 00:12:43.864 "rw_mbytes_per_sec": 0, 00:12:43.864 "r_mbytes_per_sec": 0, 00:12:43.864 "w_mbytes_per_sec": 0 00:12:43.864 }, 00:12:43.864 "claimed": true, 00:12:43.864 "claim_type": "exclusive_write", 00:12:43.864 "zoned": false, 00:12:43.864 "supported_io_types": { 00:12:43.864 "read": true, 00:12:43.864 "write": true, 00:12:43.864 "unmap": true, 00:12:43.864 "write_zeroes": true, 00:12:43.864 "flush": true, 00:12:43.864 "reset": true, 00:12:43.864 "compare": false, 00:12:43.864 "compare_and_write": false, 00:12:43.864 "abort": true, 00:12:43.864 "nvme_admin": false, 00:12:43.864 "nvme_io": false 00:12:43.864 }, 00:12:43.864 "memory_domains": [ 00:12:43.864 { 00:12:43.864 "dma_device_id": "system", 00:12:43.864 "dma_device_type": 1 00:12:43.864 }, 00:12:43.864 { 00:12:43.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.864 "dma_device_type": 2 00:12:43.864 } 00:12:43.864 ], 00:12:43.864 "driver_specific": {} 00:12:43.864 }' 00:12:43.864 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.864 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.864 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:43.864 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.864 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.864 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:43.864 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.124 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.124 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:44.124 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.124 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.124 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:44.124 10:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:44.385 [2024-06-10 10:08:06.051802] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:44.385 [2024-06-10 10:08:06.051817] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:44.385 [2024-06-10 10:08:06.051857] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:44.385 [2024-06-10 10:08:06.051895] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:44.385 [2024-06-10 10:08:06.051901] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12726b0 name Existed_Raid, state offline 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 978880 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 978880 ']' 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@953 -- # kill -0 978880 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 978880 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 978880' 00:12:44.385 killing process with pid 978880 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 978880 00:12:44.385 [2024-06-10 10:08:06.118083] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 978880 00:12:44.385 [2024-06-10 10:08:06.132812] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:44.385 00:12:44.385 real 0m23.684s 00:12:44.385 user 0m44.458s 00:12:44.385 sys 0m3.456s 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:12:44.385 10:08:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:44.385 ************************************ 00:12:44.385 END TEST raid_state_function_test 00:12:44.385 ************************************ 00:12:44.645 10:08:06 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:12:44.645 10:08:06 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:12:44.645 10:08:06 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:12:44.645 10:08:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:44.645 ************************************ 00:12:44.645 START TEST raid_state_function_test_sb 00:12:44.645 ************************************ 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 3 true 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 
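Between the two test cases the harness tears everything down: it deletes the raid volume over RPC (the offline/destruct messages above) and then stops the bdev_svc app it started earlier, pid 978880 in this run. A rough equivalent of that teardown, assuming $raid_pid holds the app's pid; the liveness check and wait mirror the killprocess trace but are simplified here:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # Remove the raid volume; per the log its base bdevs are released and the io device unregistered.
  $rpc bdev_raid_delete Existed_Raid
  # Stop the test app if it is still alive, then reap it.
  if kill -0 "$raid_pid" 2>/dev/null; then
      kill "$raid_pid"
  fi
  wait "$raid_pid" 2>/dev/null || true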
00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=983470 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 983470' 00:12:44.645 Process raid pid: 983470 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 983470 /var/tmp/spdk-raid.sock 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 983470 ']' 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:44.645 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:12:44.645 10:08:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:44.645 [2024-06-10 10:08:06.396898] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
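The superblock variant (raid_state_function_test_sb) repeats the same flow but passes -s to bdev_raid_create. The setup distilled from the trace is sketched below; backgrounding and the wait for the RPC socket are simplified, and note that the base bdevs need not exist yet, so the raid starts out "configuring":

  app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
  # Start the RPC target with raid debug logging enabled.
  $app -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  raid_pid=$!
  # (the harness waits for the socket to accept connections before issuing RPCs)
  # Request a raid0 volume with a 64 KiB strip and an on-disk superblock (-s).
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid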
00:12:44.645 [2024-06-10 10:08:06.396944] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:44.645 [2024-06-10 10:08:06.483394] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:44.906 [2024-06-10 10:08:06.546828] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:12:44.906 [2024-06-10 10:08:06.586076] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:44.906 [2024-06-10 10:08:06.586110] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:45.475 10:08:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:12:45.475 10:08:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:12:45.475 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:45.735 [2024-06-10 10:08:07.392966] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:45.735 [2024-06-10 10:08:07.392997] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:45.735 [2024-06-10 10:08:07.393003] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:45.735 [2024-06-10 10:08:07.393009] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:45.735 [2024-06-10 10:08:07.393013] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:45.735 [2024-06-10 10:08:07.393019] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:45.735 "name": "Existed_Raid", 00:12:45.735 "uuid": "73a363f1-0a26-4bc9-bede-cef29bee772e", 00:12:45.735 "strip_size_kb": 64, 00:12:45.735 "state": "configuring", 00:12:45.735 "raid_level": "raid0", 00:12:45.735 "superblock": true, 00:12:45.735 "num_base_bdevs": 3, 00:12:45.735 "num_base_bdevs_discovered": 0, 00:12:45.735 "num_base_bdevs_operational": 3, 00:12:45.735 "base_bdevs_list": [ 00:12:45.735 { 00:12:45.735 "name": "BaseBdev1", 00:12:45.735 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.735 "is_configured": false, 00:12:45.735 "data_offset": 0, 00:12:45.735 "data_size": 0 00:12:45.735 }, 00:12:45.735 { 00:12:45.735 "name": "BaseBdev2", 00:12:45.735 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.735 "is_configured": false, 00:12:45.735 "data_offset": 0, 00:12:45.735 "data_size": 0 00:12:45.735 }, 00:12:45.735 { 00:12:45.735 "name": "BaseBdev3", 00:12:45.735 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.735 "is_configured": false, 00:12:45.735 "data_offset": 0, 00:12:45.735 "data_size": 0 00:12:45.735 } 00:12:45.735 ] 00:12:45.735 }' 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:45.735 10:08:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:46.305 10:08:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:46.565 [2024-06-10 10:08:08.307153] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:46.565 [2024-06-10 10:08:08.307169] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10d0b00 name Existed_Raid, state configuring 00:12:46.565 10:08:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:46.825 [2024-06-10 10:08:08.479617] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:46.825 [2024-06-10 10:08:08.479634] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:46.825 [2024-06-10 10:08:08.479639] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:46.825 [2024-06-10 10:08:08.479645] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:46.825 [2024-06-10 10:08:08.479649] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:46.825 [2024-06-10 10:08:08.479654] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:46.825 10:08:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:46.825 [2024-06-10 10:08:08.674788] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:46.825 BaseBdev1 00:12:46.825 10:08:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:46.825 10:08:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:12:46.825 10:08:08 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:46.825 10:08:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:46.825 10:08:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:46.825 10:08:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:46.825 10:08:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:47.085 10:08:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:47.344 [ 00:12:47.344 { 00:12:47.344 "name": "BaseBdev1", 00:12:47.344 "aliases": [ 00:12:47.344 "612e4782-d58d-4ca4-9efe-8516f1c59de5" 00:12:47.344 ], 00:12:47.344 "product_name": "Malloc disk", 00:12:47.344 "block_size": 512, 00:12:47.344 "num_blocks": 65536, 00:12:47.344 "uuid": "612e4782-d58d-4ca4-9efe-8516f1c59de5", 00:12:47.344 "assigned_rate_limits": { 00:12:47.344 "rw_ios_per_sec": 0, 00:12:47.344 "rw_mbytes_per_sec": 0, 00:12:47.344 "r_mbytes_per_sec": 0, 00:12:47.344 "w_mbytes_per_sec": 0 00:12:47.344 }, 00:12:47.344 "claimed": true, 00:12:47.344 "claim_type": "exclusive_write", 00:12:47.344 "zoned": false, 00:12:47.344 "supported_io_types": { 00:12:47.344 "read": true, 00:12:47.344 "write": true, 00:12:47.344 "unmap": true, 00:12:47.344 "write_zeroes": true, 00:12:47.344 "flush": true, 00:12:47.344 "reset": true, 00:12:47.344 "compare": false, 00:12:47.344 "compare_and_write": false, 00:12:47.344 "abort": true, 00:12:47.344 "nvme_admin": false, 00:12:47.344 "nvme_io": false 00:12:47.344 }, 00:12:47.344 "memory_domains": [ 00:12:47.344 { 00:12:47.344 "dma_device_id": "system", 00:12:47.344 "dma_device_type": 1 00:12:47.344 }, 00:12:47.344 { 00:12:47.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.344 "dma_device_type": 2 00:12:47.344 } 00:12:47.344 ], 00:12:47.344 "driver_specific": {} 00:12:47.344 } 00:12:47.344 ] 00:12:47.344 10:08:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.345 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:47.604 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:47.604 "name": "Existed_Raid", 00:12:47.604 "uuid": "10b541e3-652d-45eb-9dce-b94de6252fdd", 00:12:47.604 "strip_size_kb": 64, 00:12:47.604 "state": "configuring", 00:12:47.604 "raid_level": "raid0", 00:12:47.604 "superblock": true, 00:12:47.604 "num_base_bdevs": 3, 00:12:47.604 "num_base_bdevs_discovered": 1, 00:12:47.604 "num_base_bdevs_operational": 3, 00:12:47.604 "base_bdevs_list": [ 00:12:47.604 { 00:12:47.604 "name": "BaseBdev1", 00:12:47.604 "uuid": "612e4782-d58d-4ca4-9efe-8516f1c59de5", 00:12:47.604 "is_configured": true, 00:12:47.604 "data_offset": 2048, 00:12:47.604 "data_size": 63488 00:12:47.604 }, 00:12:47.604 { 00:12:47.604 "name": "BaseBdev2", 00:12:47.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:47.604 "is_configured": false, 00:12:47.604 "data_offset": 0, 00:12:47.604 "data_size": 0 00:12:47.604 }, 00:12:47.605 { 00:12:47.605 "name": "BaseBdev3", 00:12:47.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:47.605 "is_configured": false, 00:12:47.605 "data_offset": 0, 00:12:47.605 "data_size": 0 00:12:47.605 } 00:12:47.605 ] 00:12:47.605 }' 00:12:47.605 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:47.605 10:08:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:48.174 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:48.174 [2024-06-10 10:08:09.945995] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:48.174 [2024-06-10 10:08:09.946020] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10d03f0 name Existed_Raid, state configuring 00:12:48.174 10:08:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:48.434 [2024-06-10 10:08:10.126495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:48.434 [2024-06-10 10:08:10.127627] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:48.434 [2024-06-10 10:08:10.127651] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:48.434 [2024-06-10 10:08:10.127657] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:48.434 [2024-06-10 10:08:10.127663] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.434 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:48.694 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.694 "name": "Existed_Raid", 00:12:48.694 "uuid": "6f3ac01c-9466-42fb-af78-cfb7269017f4", 00:12:48.694 "strip_size_kb": 64, 00:12:48.694 "state": "configuring", 00:12:48.694 "raid_level": "raid0", 00:12:48.694 "superblock": true, 00:12:48.694 "num_base_bdevs": 3, 00:12:48.694 "num_base_bdevs_discovered": 1, 00:12:48.694 "num_base_bdevs_operational": 3, 00:12:48.694 "base_bdevs_list": [ 00:12:48.694 { 00:12:48.694 "name": "BaseBdev1", 00:12:48.694 "uuid": "612e4782-d58d-4ca4-9efe-8516f1c59de5", 00:12:48.694 "is_configured": true, 00:12:48.694 "data_offset": 2048, 00:12:48.694 "data_size": 63488 00:12:48.694 }, 00:12:48.694 { 00:12:48.694 "name": "BaseBdev2", 00:12:48.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.694 "is_configured": false, 00:12:48.694 "data_offset": 0, 00:12:48.694 "data_size": 0 00:12:48.694 }, 00:12:48.694 { 00:12:48.694 "name": "BaseBdev3", 00:12:48.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.694 "is_configured": false, 00:12:48.694 "data_offset": 0, 00:12:48.694 "data_size": 0 00:12:48.694 } 00:12:48.694 ] 00:12:48.694 }' 00:12:48.694 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.694 10:08:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:49.264 10:08:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:49.264 [2024-06-10 10:08:11.025755] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:49.264 BaseBdev2 00:12:49.264 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:49.264 10:08:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:12:49.264 10:08:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:49.264 10:08:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:49.264 10:08:11 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:49.264 10:08:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:49.264 10:08:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:49.524 10:08:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:49.784 [ 00:12:49.784 { 00:12:49.784 "name": "BaseBdev2", 00:12:49.784 "aliases": [ 00:12:49.784 "2a1aa796-d854-47e8-8e4e-816890a907e3" 00:12:49.784 ], 00:12:49.784 "product_name": "Malloc disk", 00:12:49.784 "block_size": 512, 00:12:49.784 "num_blocks": 65536, 00:12:49.784 "uuid": "2a1aa796-d854-47e8-8e4e-816890a907e3", 00:12:49.784 "assigned_rate_limits": { 00:12:49.784 "rw_ios_per_sec": 0, 00:12:49.784 "rw_mbytes_per_sec": 0, 00:12:49.784 "r_mbytes_per_sec": 0, 00:12:49.784 "w_mbytes_per_sec": 0 00:12:49.784 }, 00:12:49.784 "claimed": true, 00:12:49.784 "claim_type": "exclusive_write", 00:12:49.784 "zoned": false, 00:12:49.784 "supported_io_types": { 00:12:49.784 "read": true, 00:12:49.784 "write": true, 00:12:49.784 "unmap": true, 00:12:49.784 "write_zeroes": true, 00:12:49.784 "flush": true, 00:12:49.784 "reset": true, 00:12:49.784 "compare": false, 00:12:49.784 "compare_and_write": false, 00:12:49.784 "abort": true, 00:12:49.784 "nvme_admin": false, 00:12:49.784 "nvme_io": false 00:12:49.784 }, 00:12:49.784 "memory_domains": [ 00:12:49.784 { 00:12:49.784 "dma_device_id": "system", 00:12:49.784 "dma_device_type": 1 00:12:49.784 }, 00:12:49.784 { 00:12:49.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.784 "dma_device_type": 2 00:12:49.784 } 00:12:49.784 ], 00:12:49.784 "driver_specific": {} 00:12:49.784 } 00:12:49.784 ] 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:49.784 10:08:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:49.784 "name": "Existed_Raid", 00:12:49.784 "uuid": "6f3ac01c-9466-42fb-af78-cfb7269017f4", 00:12:49.784 "strip_size_kb": 64, 00:12:49.784 "state": "configuring", 00:12:49.784 "raid_level": "raid0", 00:12:49.784 "superblock": true, 00:12:49.784 "num_base_bdevs": 3, 00:12:49.784 "num_base_bdevs_discovered": 2, 00:12:49.784 "num_base_bdevs_operational": 3, 00:12:49.784 "base_bdevs_list": [ 00:12:49.784 { 00:12:49.784 "name": "BaseBdev1", 00:12:49.784 "uuid": "612e4782-d58d-4ca4-9efe-8516f1c59de5", 00:12:49.784 "is_configured": true, 00:12:49.784 "data_offset": 2048, 00:12:49.784 "data_size": 63488 00:12:49.784 }, 00:12:49.784 { 00:12:49.784 "name": "BaseBdev2", 00:12:49.784 "uuid": "2a1aa796-d854-47e8-8e4e-816890a907e3", 00:12:49.784 "is_configured": true, 00:12:49.784 "data_offset": 2048, 00:12:49.784 "data_size": 63488 00:12:49.784 }, 00:12:49.784 { 00:12:49.784 "name": "BaseBdev3", 00:12:49.784 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.784 "is_configured": false, 00:12:49.784 "data_offset": 0, 00:12:49.784 "data_size": 0 00:12:49.784 } 00:12:49.784 ] 00:12:49.784 }' 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:49.784 10:08:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:50.354 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:50.615 [2024-06-10 10:08:12.321926] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:50.615 [2024-06-10 10:08:12.322038] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10d12c0 00:12:50.615 [2024-06-10 10:08:12.322046] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:50.615 [2024-06-10 10:08:12.322180] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1274970 00:12:50.615 [2024-06-10 10:08:12.322263] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10d12c0 00:12:50.615 [2024-06-10 10:08:12.322269] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10d12c0 00:12:50.615 [2024-06-10 10:08:12.322335] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:50.615 BaseBdev3 00:12:50.615 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:50.615 10:08:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:12:50.615 10:08:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:50.615 10:08:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:50.615 10:08:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:50.615 10:08:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 
00:12:50.615 10:08:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:50.876 [ 00:12:50.876 { 00:12:50.876 "name": "BaseBdev3", 00:12:50.876 "aliases": [ 00:12:50.876 "50eb5adf-0a21-465f-89a1-487934c07408" 00:12:50.876 ], 00:12:50.876 "product_name": "Malloc disk", 00:12:50.876 "block_size": 512, 00:12:50.876 "num_blocks": 65536, 00:12:50.876 "uuid": "50eb5adf-0a21-465f-89a1-487934c07408", 00:12:50.876 "assigned_rate_limits": { 00:12:50.876 "rw_ios_per_sec": 0, 00:12:50.876 "rw_mbytes_per_sec": 0, 00:12:50.876 "r_mbytes_per_sec": 0, 00:12:50.876 "w_mbytes_per_sec": 0 00:12:50.876 }, 00:12:50.876 "claimed": true, 00:12:50.876 "claim_type": "exclusive_write", 00:12:50.876 "zoned": false, 00:12:50.876 "supported_io_types": { 00:12:50.876 "read": true, 00:12:50.876 "write": true, 00:12:50.876 "unmap": true, 00:12:50.876 "write_zeroes": true, 00:12:50.876 "flush": true, 00:12:50.876 "reset": true, 00:12:50.876 "compare": false, 00:12:50.876 "compare_and_write": false, 00:12:50.876 "abort": true, 00:12:50.876 "nvme_admin": false, 00:12:50.876 "nvme_io": false 00:12:50.876 }, 00:12:50.876 "memory_domains": [ 00:12:50.876 { 00:12:50.876 "dma_device_id": "system", 00:12:50.876 "dma_device_type": 1 00:12:50.876 }, 00:12:50.876 { 00:12:50.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.876 "dma_device_type": 2 00:12:50.876 } 00:12:50.876 ], 00:12:50.876 "driver_specific": {} 00:12:50.876 } 00:12:50.876 ] 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:50.876 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.136 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.136 "name": "Existed_Raid", 00:12:51.136 "uuid": "6f3ac01c-9466-42fb-af78-cfb7269017f4", 00:12:51.136 "strip_size_kb": 64, 00:12:51.136 "state": "online", 00:12:51.136 "raid_level": "raid0", 00:12:51.136 "superblock": true, 00:12:51.136 "num_base_bdevs": 3, 00:12:51.136 "num_base_bdevs_discovered": 3, 00:12:51.136 "num_base_bdevs_operational": 3, 00:12:51.136 "base_bdevs_list": [ 00:12:51.136 { 00:12:51.136 "name": "BaseBdev1", 00:12:51.136 "uuid": "612e4782-d58d-4ca4-9efe-8516f1c59de5", 00:12:51.136 "is_configured": true, 00:12:51.136 "data_offset": 2048, 00:12:51.136 "data_size": 63488 00:12:51.136 }, 00:12:51.136 { 00:12:51.136 "name": "BaseBdev2", 00:12:51.136 "uuid": "2a1aa796-d854-47e8-8e4e-816890a907e3", 00:12:51.136 "is_configured": true, 00:12:51.136 "data_offset": 2048, 00:12:51.136 "data_size": 63488 00:12:51.136 }, 00:12:51.136 { 00:12:51.136 "name": "BaseBdev3", 00:12:51.136 "uuid": "50eb5adf-0a21-465f-89a1-487934c07408", 00:12:51.136 "is_configured": true, 00:12:51.136 "data_offset": 2048, 00:12:51.137 "data_size": 63488 00:12:51.137 } 00:12:51.137 ] 00:12:51.137 }' 00:12:51.137 10:08:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.137 10:08:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:51.707 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:51.707 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:51.707 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:51.707 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:51.707 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:51.707 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:51.707 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:51.707 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:51.968 [2024-06-10 10:08:13.617416] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:51.968 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:51.968 "name": "Existed_Raid", 00:12:51.968 "aliases": [ 00:12:51.968 "6f3ac01c-9466-42fb-af78-cfb7269017f4" 00:12:51.968 ], 00:12:51.968 "product_name": "Raid Volume", 00:12:51.968 "block_size": 512, 00:12:51.968 "num_blocks": 190464, 00:12:51.968 "uuid": "6f3ac01c-9466-42fb-af78-cfb7269017f4", 00:12:51.968 "assigned_rate_limits": { 00:12:51.968 "rw_ios_per_sec": 0, 00:12:51.968 "rw_mbytes_per_sec": 0, 00:12:51.968 "r_mbytes_per_sec": 0, 00:12:51.968 "w_mbytes_per_sec": 0 00:12:51.968 }, 00:12:51.968 "claimed": false, 00:12:51.968 "zoned": false, 00:12:51.968 "supported_io_types": { 00:12:51.968 "read": true, 00:12:51.968 "write": true, 00:12:51.968 "unmap": true, 00:12:51.968 "write_zeroes": true, 00:12:51.968 "flush": true, 00:12:51.968 "reset": true, 00:12:51.968 
"compare": false, 00:12:51.968 "compare_and_write": false, 00:12:51.968 "abort": false, 00:12:51.968 "nvme_admin": false, 00:12:51.968 "nvme_io": false 00:12:51.968 }, 00:12:51.968 "memory_domains": [ 00:12:51.968 { 00:12:51.968 "dma_device_id": "system", 00:12:51.968 "dma_device_type": 1 00:12:51.968 }, 00:12:51.968 { 00:12:51.968 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.968 "dma_device_type": 2 00:12:51.968 }, 00:12:51.968 { 00:12:51.968 "dma_device_id": "system", 00:12:51.968 "dma_device_type": 1 00:12:51.968 }, 00:12:51.968 { 00:12:51.968 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.968 "dma_device_type": 2 00:12:51.968 }, 00:12:51.968 { 00:12:51.968 "dma_device_id": "system", 00:12:51.968 "dma_device_type": 1 00:12:51.968 }, 00:12:51.968 { 00:12:51.968 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.968 "dma_device_type": 2 00:12:51.968 } 00:12:51.968 ], 00:12:51.968 "driver_specific": { 00:12:51.968 "raid": { 00:12:51.968 "uuid": "6f3ac01c-9466-42fb-af78-cfb7269017f4", 00:12:51.968 "strip_size_kb": 64, 00:12:51.968 "state": "online", 00:12:51.968 "raid_level": "raid0", 00:12:51.968 "superblock": true, 00:12:51.968 "num_base_bdevs": 3, 00:12:51.968 "num_base_bdevs_discovered": 3, 00:12:51.968 "num_base_bdevs_operational": 3, 00:12:51.968 "base_bdevs_list": [ 00:12:51.968 { 00:12:51.968 "name": "BaseBdev1", 00:12:51.968 "uuid": "612e4782-d58d-4ca4-9efe-8516f1c59de5", 00:12:51.968 "is_configured": true, 00:12:51.968 "data_offset": 2048, 00:12:51.968 "data_size": 63488 00:12:51.968 }, 00:12:51.968 { 00:12:51.968 "name": "BaseBdev2", 00:12:51.968 "uuid": "2a1aa796-d854-47e8-8e4e-816890a907e3", 00:12:51.968 "is_configured": true, 00:12:51.968 "data_offset": 2048, 00:12:51.968 "data_size": 63488 00:12:51.968 }, 00:12:51.968 { 00:12:51.968 "name": "BaseBdev3", 00:12:51.968 "uuid": "50eb5adf-0a21-465f-89a1-487934c07408", 00:12:51.968 "is_configured": true, 00:12:51.968 "data_offset": 2048, 00:12:51.968 "data_size": 63488 00:12:51.968 } 00:12:51.968 ] 00:12:51.968 } 00:12:51.968 } 00:12:51.968 }' 00:12:51.968 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:51.968 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:51.968 BaseBdev2 00:12:51.968 BaseBdev3' 00:12:51.968 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:51.968 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:51.968 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:52.228 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:52.228 "name": "BaseBdev1", 00:12:52.228 "aliases": [ 00:12:52.228 "612e4782-d58d-4ca4-9efe-8516f1c59de5" 00:12:52.228 ], 00:12:52.228 "product_name": "Malloc disk", 00:12:52.228 "block_size": 512, 00:12:52.228 "num_blocks": 65536, 00:12:52.228 "uuid": "612e4782-d58d-4ca4-9efe-8516f1c59de5", 00:12:52.228 "assigned_rate_limits": { 00:12:52.228 "rw_ios_per_sec": 0, 00:12:52.228 "rw_mbytes_per_sec": 0, 00:12:52.228 "r_mbytes_per_sec": 0, 00:12:52.228 "w_mbytes_per_sec": 0 00:12:52.228 }, 00:12:52.228 "claimed": true, 00:12:52.228 "claim_type": "exclusive_write", 00:12:52.228 "zoned": false, 00:12:52.228 
"supported_io_types": { 00:12:52.228 "read": true, 00:12:52.228 "write": true, 00:12:52.228 "unmap": true, 00:12:52.228 "write_zeroes": true, 00:12:52.228 "flush": true, 00:12:52.228 "reset": true, 00:12:52.228 "compare": false, 00:12:52.228 "compare_and_write": false, 00:12:52.228 "abort": true, 00:12:52.228 "nvme_admin": false, 00:12:52.228 "nvme_io": false 00:12:52.228 }, 00:12:52.228 "memory_domains": [ 00:12:52.228 { 00:12:52.228 "dma_device_id": "system", 00:12:52.228 "dma_device_type": 1 00:12:52.228 }, 00:12:52.228 { 00:12:52.228 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.228 "dma_device_type": 2 00:12:52.228 } 00:12:52.228 ], 00:12:52.229 "driver_specific": {} 00:12:52.229 }' 00:12:52.229 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:52.229 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:52.229 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:52.229 10:08:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.229 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.229 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:52.229 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:52.229 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:52.489 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:52.489 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:52.489 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:52.489 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:52.489 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:52.489 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:52.489 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:52.749 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:52.749 "name": "BaseBdev2", 00:12:52.749 "aliases": [ 00:12:52.749 "2a1aa796-d854-47e8-8e4e-816890a907e3" 00:12:52.749 ], 00:12:52.749 "product_name": "Malloc disk", 00:12:52.749 "block_size": 512, 00:12:52.749 "num_blocks": 65536, 00:12:52.749 "uuid": "2a1aa796-d854-47e8-8e4e-816890a907e3", 00:12:52.749 "assigned_rate_limits": { 00:12:52.749 "rw_ios_per_sec": 0, 00:12:52.749 "rw_mbytes_per_sec": 0, 00:12:52.749 "r_mbytes_per_sec": 0, 00:12:52.749 "w_mbytes_per_sec": 0 00:12:52.749 }, 00:12:52.749 "claimed": true, 00:12:52.749 "claim_type": "exclusive_write", 00:12:52.749 "zoned": false, 00:12:52.749 "supported_io_types": { 00:12:52.749 "read": true, 00:12:52.749 "write": true, 00:12:52.749 "unmap": true, 00:12:52.749 "write_zeroes": true, 00:12:52.749 "flush": true, 00:12:52.749 "reset": true, 00:12:52.749 "compare": false, 00:12:52.749 "compare_and_write": false, 00:12:52.749 "abort": true, 00:12:52.749 "nvme_admin": false, 00:12:52.749 "nvme_io": false 00:12:52.749 }, 00:12:52.749 "memory_domains": [ 00:12:52.749 { 
00:12:52.749 "dma_device_id": "system", 00:12:52.749 "dma_device_type": 1 00:12:52.749 }, 00:12:52.749 { 00:12:52.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.749 "dma_device_type": 2 00:12:52.749 } 00:12:52.749 ], 00:12:52.749 "driver_specific": {} 00:12:52.749 }' 00:12:52.749 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:52.749 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:52.749 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:52.749 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.749 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.749 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:52.749 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:52.749 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.009 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:53.009 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.009 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.009 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:53.009 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:53.009 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:53.009 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:53.271 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:53.271 "name": "BaseBdev3", 00:12:53.271 "aliases": [ 00:12:53.271 "50eb5adf-0a21-465f-89a1-487934c07408" 00:12:53.271 ], 00:12:53.271 "product_name": "Malloc disk", 00:12:53.271 "block_size": 512, 00:12:53.271 "num_blocks": 65536, 00:12:53.271 "uuid": "50eb5adf-0a21-465f-89a1-487934c07408", 00:12:53.271 "assigned_rate_limits": { 00:12:53.271 "rw_ios_per_sec": 0, 00:12:53.271 "rw_mbytes_per_sec": 0, 00:12:53.271 "r_mbytes_per_sec": 0, 00:12:53.271 "w_mbytes_per_sec": 0 00:12:53.271 }, 00:12:53.271 "claimed": true, 00:12:53.271 "claim_type": "exclusive_write", 00:12:53.271 "zoned": false, 00:12:53.271 "supported_io_types": { 00:12:53.271 "read": true, 00:12:53.271 "write": true, 00:12:53.271 "unmap": true, 00:12:53.271 "write_zeroes": true, 00:12:53.271 "flush": true, 00:12:53.271 "reset": true, 00:12:53.271 "compare": false, 00:12:53.271 "compare_and_write": false, 00:12:53.271 "abort": true, 00:12:53.271 "nvme_admin": false, 00:12:53.271 "nvme_io": false 00:12:53.271 }, 00:12:53.271 "memory_domains": [ 00:12:53.271 { 00:12:53.271 "dma_device_id": "system", 00:12:53.271 "dma_device_type": 1 00:12:53.271 }, 00:12:53.271 { 00:12:53.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.271 "dma_device_type": 2 00:12:53.271 } 00:12:53.271 ], 00:12:53.271 "driver_specific": {} 00:12:53.271 }' 00:12:53.271 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.271 10:08:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.271 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:53.271 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.271 10:08:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.271 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:53.271 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.271 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.271 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:53.271 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.531 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.531 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:53.531 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:53.531 [2024-06-10 10:08:15.377691] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:53.531 [2024-06-10 10:08:15.377708] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:53.531 [2024-06-10 10:08:15.377737] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:53.531 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:53.531 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:53.532 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.793 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:53.793 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:53.793 "name": "Existed_Raid", 00:12:53.793 "uuid": "6f3ac01c-9466-42fb-af78-cfb7269017f4", 00:12:53.793 "strip_size_kb": 64, 00:12:53.793 "state": "offline", 00:12:53.793 "raid_level": "raid0", 00:12:53.793 "superblock": true, 00:12:53.793 "num_base_bdevs": 3, 00:12:53.793 "num_base_bdevs_discovered": 2, 00:12:53.793 "num_base_bdevs_operational": 2, 00:12:53.793 "base_bdevs_list": [ 00:12:53.793 { 00:12:53.793 "name": null, 00:12:53.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:53.793 "is_configured": false, 00:12:53.793 "data_offset": 2048, 00:12:53.793 "data_size": 63488 00:12:53.793 }, 00:12:53.793 { 00:12:53.793 "name": "BaseBdev2", 00:12:53.793 "uuid": "2a1aa796-d854-47e8-8e4e-816890a907e3", 00:12:53.793 "is_configured": true, 00:12:53.793 "data_offset": 2048, 00:12:53.793 "data_size": 63488 00:12:53.793 }, 00:12:53.793 { 00:12:53.793 "name": "BaseBdev3", 00:12:53.793 "uuid": "50eb5adf-0a21-465f-89a1-487934c07408", 00:12:53.793 "is_configured": true, 00:12:53.793 "data_offset": 2048, 00:12:53.793 "data_size": 63488 00:12:53.793 } 00:12:53.793 ] 00:12:53.793 }' 00:12:53.793 10:08:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:53.793 10:08:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:54.365 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:54.365 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:54.365 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.365 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:54.625 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:54.625 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:54.625 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:54.625 [2024-06-10 10:08:16.456424] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:54.625 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:54.625 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:54.625 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.625 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:54.886 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:54.886 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:54.886 10:08:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:55.146 [2024-06-10 10:08:16.839326] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:55.146 [2024-06-10 10:08:16.839355] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10d12c0 name Existed_Raid, state offline 00:12:55.146 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:55.146 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:55.146 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.146 10:08:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:55.407 BaseBdev2 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:55.407 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:55.666 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:55.926 [ 00:12:55.926 { 00:12:55.926 "name": "BaseBdev2", 00:12:55.926 "aliases": [ 00:12:55.926 "0962c7aa-c41f-4782-bca0-8edd5baefda1" 00:12:55.926 ], 00:12:55.926 "product_name": "Malloc disk", 00:12:55.926 "block_size": 512, 00:12:55.926 "num_blocks": 65536, 00:12:55.926 "uuid": "0962c7aa-c41f-4782-bca0-8edd5baefda1", 00:12:55.926 "assigned_rate_limits": { 00:12:55.926 "rw_ios_per_sec": 0, 00:12:55.926 "rw_mbytes_per_sec": 0, 00:12:55.926 "r_mbytes_per_sec": 0, 00:12:55.926 "w_mbytes_per_sec": 0 00:12:55.926 }, 00:12:55.926 "claimed": false, 00:12:55.926 "zoned": false, 00:12:55.926 "supported_io_types": { 00:12:55.926 "read": true, 00:12:55.926 "write": true, 
00:12:55.926 "unmap": true, 00:12:55.926 "write_zeroes": true, 00:12:55.926 "flush": true, 00:12:55.926 "reset": true, 00:12:55.926 "compare": false, 00:12:55.926 "compare_and_write": false, 00:12:55.926 "abort": true, 00:12:55.926 "nvme_admin": false, 00:12:55.926 "nvme_io": false 00:12:55.926 }, 00:12:55.926 "memory_domains": [ 00:12:55.926 { 00:12:55.926 "dma_device_id": "system", 00:12:55.926 "dma_device_type": 1 00:12:55.926 }, 00:12:55.926 { 00:12:55.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:55.926 "dma_device_type": 2 00:12:55.926 } 00:12:55.926 ], 00:12:55.926 "driver_specific": {} 00:12:55.926 } 00:12:55.926 ] 00:12:55.926 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:55.926 10:08:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:55.926 10:08:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:55.926 10:08:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:55.926 BaseBdev3 00:12:56.185 10:08:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:56.185 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:12:56.185 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:56.185 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:56.185 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:56.185 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:56.185 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:56.185 10:08:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:56.445 [ 00:12:56.445 { 00:12:56.445 "name": "BaseBdev3", 00:12:56.445 "aliases": [ 00:12:56.445 "0a7057ef-ebf0-427b-8014-0379d8788fdf" 00:12:56.445 ], 00:12:56.445 "product_name": "Malloc disk", 00:12:56.445 "block_size": 512, 00:12:56.445 "num_blocks": 65536, 00:12:56.445 "uuid": "0a7057ef-ebf0-427b-8014-0379d8788fdf", 00:12:56.445 "assigned_rate_limits": { 00:12:56.445 "rw_ios_per_sec": 0, 00:12:56.445 "rw_mbytes_per_sec": 0, 00:12:56.445 "r_mbytes_per_sec": 0, 00:12:56.445 "w_mbytes_per_sec": 0 00:12:56.445 }, 00:12:56.445 "claimed": false, 00:12:56.445 "zoned": false, 00:12:56.445 "supported_io_types": { 00:12:56.445 "read": true, 00:12:56.445 "write": true, 00:12:56.445 "unmap": true, 00:12:56.445 "write_zeroes": true, 00:12:56.445 "flush": true, 00:12:56.445 "reset": true, 00:12:56.445 "compare": false, 00:12:56.445 "compare_and_write": false, 00:12:56.445 "abort": true, 00:12:56.445 "nvme_admin": false, 00:12:56.445 "nvme_io": false 00:12:56.445 }, 00:12:56.445 "memory_domains": [ 00:12:56.445 { 00:12:56.445 "dma_device_id": "system", 00:12:56.445 "dma_device_type": 1 00:12:56.445 }, 00:12:56.445 { 00:12:56.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.445 "dma_device_type": 2 00:12:56.445 } 
00:12:56.445 ], 00:12:56.445 "driver_specific": {} 00:12:56.446 } 00:12:56.446 ] 00:12:56.446 10:08:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:56.446 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:56.446 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:56.446 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:56.705 [2024-06-10 10:08:18.334980] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:56.705 [2024-06-10 10:08:18.335009] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:56.705 [2024-06-10 10:08:18.335021] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:56.705 [2024-06-10 10:08:18.336061] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:56.705 "name": "Existed_Raid", 00:12:56.705 "uuid": "6e91d988-782e-4a9d-8420-18bd5a6fa51d", 00:12:56.705 "strip_size_kb": 64, 00:12:56.705 "state": "configuring", 00:12:56.705 "raid_level": "raid0", 00:12:56.705 "superblock": true, 00:12:56.705 "num_base_bdevs": 3, 00:12:56.705 "num_base_bdevs_discovered": 2, 00:12:56.705 "num_base_bdevs_operational": 3, 00:12:56.705 "base_bdevs_list": [ 00:12:56.705 { 00:12:56.705 "name": "BaseBdev1", 00:12:56.705 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.705 "is_configured": false, 00:12:56.705 "data_offset": 0, 00:12:56.705 "data_size": 0 00:12:56.705 }, 00:12:56.705 { 00:12:56.705 "name": "BaseBdev2", 00:12:56.705 "uuid": "0962c7aa-c41f-4782-bca0-8edd5baefda1", 
00:12:56.705 "is_configured": true, 00:12:56.705 "data_offset": 2048, 00:12:56.705 "data_size": 63488 00:12:56.705 }, 00:12:56.705 { 00:12:56.705 "name": "BaseBdev3", 00:12:56.705 "uuid": "0a7057ef-ebf0-427b-8014-0379d8788fdf", 00:12:56.705 "is_configured": true, 00:12:56.705 "data_offset": 2048, 00:12:56.705 "data_size": 63488 00:12:56.705 } 00:12:56.705 ] 00:12:56.705 }' 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:56.705 10:08:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:57.275 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:57.534 [2024-06-10 10:08:19.269311] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.534 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:57.794 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.794 "name": "Existed_Raid", 00:12:57.794 "uuid": "6e91d988-782e-4a9d-8420-18bd5a6fa51d", 00:12:57.794 "strip_size_kb": 64, 00:12:57.794 "state": "configuring", 00:12:57.794 "raid_level": "raid0", 00:12:57.794 "superblock": true, 00:12:57.794 "num_base_bdevs": 3, 00:12:57.794 "num_base_bdevs_discovered": 1, 00:12:57.794 "num_base_bdevs_operational": 3, 00:12:57.794 "base_bdevs_list": [ 00:12:57.794 { 00:12:57.794 "name": "BaseBdev1", 00:12:57.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.794 "is_configured": false, 00:12:57.794 "data_offset": 0, 00:12:57.794 "data_size": 0 00:12:57.794 }, 00:12:57.794 { 00:12:57.794 "name": null, 00:12:57.794 "uuid": "0962c7aa-c41f-4782-bca0-8edd5baefda1", 00:12:57.794 "is_configured": false, 00:12:57.794 "data_offset": 2048, 00:12:57.794 "data_size": 63488 00:12:57.794 }, 00:12:57.794 { 00:12:57.794 "name": "BaseBdev3", 00:12:57.794 "uuid": "0a7057ef-ebf0-427b-8014-0379d8788fdf", 00:12:57.794 "is_configured": true, 00:12:57.794 
"data_offset": 2048, 00:12:57.794 "data_size": 63488 00:12:57.794 } 00:12:57.794 ] 00:12:57.794 }' 00:12:57.794 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.794 10:08:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:58.365 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.365 10:08:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:58.365 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:58.365 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:58.624 [2024-06-10 10:08:20.364899] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:58.624 BaseBdev1 00:12:58.624 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:58.624 10:08:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:12:58.625 10:08:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:58.625 10:08:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:58.625 10:08:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:58.625 10:08:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:58.625 10:08:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:58.884 10:08:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:58.884 [ 00:12:58.884 { 00:12:58.884 "name": "BaseBdev1", 00:12:58.884 "aliases": [ 00:12:58.884 "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9" 00:12:58.884 ], 00:12:58.884 "product_name": "Malloc disk", 00:12:58.884 "block_size": 512, 00:12:58.884 "num_blocks": 65536, 00:12:58.885 "uuid": "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9", 00:12:58.885 "assigned_rate_limits": { 00:12:58.885 "rw_ios_per_sec": 0, 00:12:58.885 "rw_mbytes_per_sec": 0, 00:12:58.885 "r_mbytes_per_sec": 0, 00:12:58.885 "w_mbytes_per_sec": 0 00:12:58.885 }, 00:12:58.885 "claimed": true, 00:12:58.885 "claim_type": "exclusive_write", 00:12:58.885 "zoned": false, 00:12:58.885 "supported_io_types": { 00:12:58.885 "read": true, 00:12:58.885 "write": true, 00:12:58.885 "unmap": true, 00:12:58.885 "write_zeroes": true, 00:12:58.885 "flush": true, 00:12:58.885 "reset": true, 00:12:58.885 "compare": false, 00:12:58.885 "compare_and_write": false, 00:12:58.885 "abort": true, 00:12:58.885 "nvme_admin": false, 00:12:58.885 "nvme_io": false 00:12:58.885 }, 00:12:58.885 "memory_domains": [ 00:12:58.885 { 00:12:58.885 "dma_device_id": "system", 00:12:58.885 "dma_device_type": 1 00:12:58.885 }, 00:12:58.885 { 00:12:58.885 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:58.885 "dma_device_type": 2 00:12:58.885 } 00:12:58.885 ], 
00:12:58.885 "driver_specific": {} 00:12:58.885 } 00:12:58.885 ] 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.145 "name": "Existed_Raid", 00:12:59.145 "uuid": "6e91d988-782e-4a9d-8420-18bd5a6fa51d", 00:12:59.145 "strip_size_kb": 64, 00:12:59.145 "state": "configuring", 00:12:59.145 "raid_level": "raid0", 00:12:59.145 "superblock": true, 00:12:59.145 "num_base_bdevs": 3, 00:12:59.145 "num_base_bdevs_discovered": 2, 00:12:59.145 "num_base_bdevs_operational": 3, 00:12:59.145 "base_bdevs_list": [ 00:12:59.145 { 00:12:59.145 "name": "BaseBdev1", 00:12:59.145 "uuid": "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9", 00:12:59.145 "is_configured": true, 00:12:59.145 "data_offset": 2048, 00:12:59.145 "data_size": 63488 00:12:59.145 }, 00:12:59.145 { 00:12:59.145 "name": null, 00:12:59.145 "uuid": "0962c7aa-c41f-4782-bca0-8edd5baefda1", 00:12:59.145 "is_configured": false, 00:12:59.145 "data_offset": 2048, 00:12:59.145 "data_size": 63488 00:12:59.145 }, 00:12:59.145 { 00:12:59.145 "name": "BaseBdev3", 00:12:59.145 "uuid": "0a7057ef-ebf0-427b-8014-0379d8788fdf", 00:12:59.145 "is_configured": true, 00:12:59.145 "data_offset": 2048, 00:12:59.145 "data_size": 63488 00:12:59.145 } 00:12:59.145 ] 00:12:59.145 }' 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.145 10:08:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:59.715 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:59.715 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.975 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ 
true == \t\r\u\e ]] 00:12:59.975 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:00.235 [2024-06-10 10:08:21.848670] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.235 10:08:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:00.235 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:00.235 "name": "Existed_Raid", 00:13:00.235 "uuid": "6e91d988-782e-4a9d-8420-18bd5a6fa51d", 00:13:00.235 "strip_size_kb": 64, 00:13:00.235 "state": "configuring", 00:13:00.235 "raid_level": "raid0", 00:13:00.235 "superblock": true, 00:13:00.235 "num_base_bdevs": 3, 00:13:00.235 "num_base_bdevs_discovered": 1, 00:13:00.235 "num_base_bdevs_operational": 3, 00:13:00.235 "base_bdevs_list": [ 00:13:00.235 { 00:13:00.235 "name": "BaseBdev1", 00:13:00.235 "uuid": "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9", 00:13:00.235 "is_configured": true, 00:13:00.235 "data_offset": 2048, 00:13:00.235 "data_size": 63488 00:13:00.235 }, 00:13:00.235 { 00:13:00.235 "name": null, 00:13:00.235 "uuid": "0962c7aa-c41f-4782-bca0-8edd5baefda1", 00:13:00.235 "is_configured": false, 00:13:00.235 "data_offset": 2048, 00:13:00.235 "data_size": 63488 00:13:00.235 }, 00:13:00.235 { 00:13:00.235 "name": null, 00:13:00.235 "uuid": "0a7057ef-ebf0-427b-8014-0379d8788fdf", 00:13:00.235 "is_configured": false, 00:13:00.235 "data_offset": 2048, 00:13:00.235 "data_size": 63488 00:13:00.235 } 00:13:00.235 ] 00:13:00.235 }' 00:13:00.235 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:00.235 10:08:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:00.806 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.806 10:08:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:01.101 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:01.101 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:01.361 [2024-06-10 10:08:22.975540] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:01.361 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:01.361 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:01.361 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:01.361 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:01.361 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:01.361 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:01.361 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.361 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.361 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.361 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.362 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.362 10:08:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:01.362 10:08:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.362 "name": "Existed_Raid", 00:13:01.362 "uuid": "6e91d988-782e-4a9d-8420-18bd5a6fa51d", 00:13:01.362 "strip_size_kb": 64, 00:13:01.362 "state": "configuring", 00:13:01.362 "raid_level": "raid0", 00:13:01.362 "superblock": true, 00:13:01.362 "num_base_bdevs": 3, 00:13:01.362 "num_base_bdevs_discovered": 2, 00:13:01.362 "num_base_bdevs_operational": 3, 00:13:01.362 "base_bdevs_list": [ 00:13:01.362 { 00:13:01.362 "name": "BaseBdev1", 00:13:01.362 "uuid": "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9", 00:13:01.362 "is_configured": true, 00:13:01.362 "data_offset": 2048, 00:13:01.362 "data_size": 63488 00:13:01.362 }, 00:13:01.362 { 00:13:01.362 "name": null, 00:13:01.362 "uuid": "0962c7aa-c41f-4782-bca0-8edd5baefda1", 00:13:01.362 "is_configured": false, 00:13:01.362 "data_offset": 2048, 00:13:01.362 "data_size": 63488 00:13:01.362 }, 00:13:01.362 { 00:13:01.362 "name": "BaseBdev3", 00:13:01.362 "uuid": "0a7057ef-ebf0-427b-8014-0379d8788fdf", 00:13:01.362 "is_configured": true, 00:13:01.362 "data_offset": 2048, 00:13:01.362 "data_size": 63488 00:13:01.362 } 00:13:01.362 ] 00:13:01.362 }' 00:13:01.362 10:08:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.362 10:08:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:01.932 10:08:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:01.932 10:08:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.192 10:08:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:02.192 10:08:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:02.192 [2024-06-10 10:08:24.054287] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:02.452 "name": "Existed_Raid", 00:13:02.452 "uuid": "6e91d988-782e-4a9d-8420-18bd5a6fa51d", 00:13:02.452 "strip_size_kb": 64, 00:13:02.452 "state": "configuring", 00:13:02.452 "raid_level": "raid0", 00:13:02.452 "superblock": true, 00:13:02.452 "num_base_bdevs": 3, 00:13:02.452 "num_base_bdevs_discovered": 1, 00:13:02.452 "num_base_bdevs_operational": 3, 00:13:02.452 "base_bdevs_list": [ 00:13:02.452 { 00:13:02.452 "name": null, 00:13:02.452 "uuid": "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9", 00:13:02.452 "is_configured": false, 00:13:02.452 "data_offset": 2048, 00:13:02.452 "data_size": 63488 00:13:02.452 }, 00:13:02.452 { 00:13:02.452 "name": null, 00:13:02.452 "uuid": "0962c7aa-c41f-4782-bca0-8edd5baefda1", 00:13:02.452 "is_configured": false, 00:13:02.452 "data_offset": 2048, 00:13:02.452 "data_size": 63488 00:13:02.452 }, 00:13:02.452 { 00:13:02.452 "name": "BaseBdev3", 00:13:02.452 "uuid": "0a7057ef-ebf0-427b-8014-0379d8788fdf", 00:13:02.452 "is_configured": true, 00:13:02.452 "data_offset": 2048, 00:13:02.452 "data_size": 63488 00:13:02.452 } 00:13:02.452 ] 00:13:02.452 }' 00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:13:02.452 10:08:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:03.022 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.022 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:03.282 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:03.283 10:08:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:03.283 [2024-06-10 10:08:25.142803] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:03.543 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:03.543 "name": "Existed_Raid", 00:13:03.543 "uuid": "6e91d988-782e-4a9d-8420-18bd5a6fa51d", 00:13:03.543 "strip_size_kb": 64, 00:13:03.543 "state": "configuring", 00:13:03.543 "raid_level": "raid0", 00:13:03.543 "superblock": true, 00:13:03.543 "num_base_bdevs": 3, 00:13:03.543 "num_base_bdevs_discovered": 2, 00:13:03.543 "num_base_bdevs_operational": 3, 00:13:03.543 "base_bdevs_list": [ 00:13:03.543 { 00:13:03.543 "name": null, 00:13:03.543 "uuid": "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9", 00:13:03.544 "is_configured": false, 00:13:03.544 "data_offset": 2048, 00:13:03.544 "data_size": 63488 00:13:03.544 }, 00:13:03.544 { 00:13:03.544 "name": "BaseBdev2", 00:13:03.544 "uuid": "0962c7aa-c41f-4782-bca0-8edd5baefda1", 00:13:03.544 "is_configured": true, 00:13:03.544 "data_offset": 2048, 00:13:03.544 "data_size": 63488 00:13:03.544 }, 00:13:03.544 { 00:13:03.544 "name": "BaseBdev3", 00:13:03.544 "uuid": "0a7057ef-ebf0-427b-8014-0379d8788fdf", 00:13:03.544 "is_configured": true, 00:13:03.544 
"data_offset": 2048, 00:13:03.544 "data_size": 63488 00:13:03.544 } 00:13:03.544 ] 00:13:03.544 }' 00:13:03.544 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.544 10:08:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:04.114 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.114 10:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:04.374 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:04.374 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.374 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:04.634 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9 00:13:04.635 [2024-06-10 10:08:26.374946] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:04.635 [2024-06-10 10:08:26.375051] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10cfd10 00:13:04.635 [2024-06-10 10:08:26.375058] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:04.635 [2024-06-10 10:08:26.375190] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdce240 00:13:04.635 [2024-06-10 10:08:26.375280] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10cfd10 00:13:04.635 [2024-06-10 10:08:26.375286] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10cfd10 00:13:04.635 [2024-06-10 10:08:26.375352] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:04.635 NewBaseBdev 00:13:04.635 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:04.635 10:08:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:13:04.635 10:08:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:04.635 10:08:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:13:04.635 10:08:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:04.635 10:08:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:04.635 10:08:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:04.895 [ 00:13:04.895 { 00:13:04.895 "name": "NewBaseBdev", 00:13:04.895 "aliases": [ 00:13:04.895 "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9" 00:13:04.895 ], 
00:13:04.895 "product_name": "Malloc disk", 00:13:04.895 "block_size": 512, 00:13:04.895 "num_blocks": 65536, 00:13:04.895 "uuid": "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9", 00:13:04.895 "assigned_rate_limits": { 00:13:04.895 "rw_ios_per_sec": 0, 00:13:04.895 "rw_mbytes_per_sec": 0, 00:13:04.895 "r_mbytes_per_sec": 0, 00:13:04.895 "w_mbytes_per_sec": 0 00:13:04.895 }, 00:13:04.895 "claimed": true, 00:13:04.895 "claim_type": "exclusive_write", 00:13:04.895 "zoned": false, 00:13:04.895 "supported_io_types": { 00:13:04.895 "read": true, 00:13:04.895 "write": true, 00:13:04.895 "unmap": true, 00:13:04.895 "write_zeroes": true, 00:13:04.895 "flush": true, 00:13:04.895 "reset": true, 00:13:04.895 "compare": false, 00:13:04.895 "compare_and_write": false, 00:13:04.895 "abort": true, 00:13:04.895 "nvme_admin": false, 00:13:04.895 "nvme_io": false 00:13:04.895 }, 00:13:04.895 "memory_domains": [ 00:13:04.895 { 00:13:04.895 "dma_device_id": "system", 00:13:04.895 "dma_device_type": 1 00:13:04.895 }, 00:13:04.895 { 00:13:04.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.895 "dma_device_type": 2 00:13:04.895 } 00:13:04.895 ], 00:13:04.895 "driver_specific": {} 00:13:04.895 } 00:13:04.895 ] 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.895 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:05.156 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.156 "name": "Existed_Raid", 00:13:05.156 "uuid": "6e91d988-782e-4a9d-8420-18bd5a6fa51d", 00:13:05.156 "strip_size_kb": 64, 00:13:05.156 "state": "online", 00:13:05.156 "raid_level": "raid0", 00:13:05.156 "superblock": true, 00:13:05.156 "num_base_bdevs": 3, 00:13:05.156 "num_base_bdevs_discovered": 3, 00:13:05.156 "num_base_bdevs_operational": 3, 00:13:05.156 "base_bdevs_list": [ 00:13:05.156 { 00:13:05.156 "name": "NewBaseBdev", 00:13:05.156 "uuid": "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9", 00:13:05.156 "is_configured": true, 00:13:05.156 "data_offset": 2048, 00:13:05.156 "data_size": 63488 00:13:05.156 
}, 00:13:05.156 { 00:13:05.156 "name": "BaseBdev2", 00:13:05.156 "uuid": "0962c7aa-c41f-4782-bca0-8edd5baefda1", 00:13:05.156 "is_configured": true, 00:13:05.156 "data_offset": 2048, 00:13:05.156 "data_size": 63488 00:13:05.156 }, 00:13:05.156 { 00:13:05.156 "name": "BaseBdev3", 00:13:05.156 "uuid": "0a7057ef-ebf0-427b-8014-0379d8788fdf", 00:13:05.156 "is_configured": true, 00:13:05.156 "data_offset": 2048, 00:13:05.156 "data_size": 63488 00:13:05.156 } 00:13:05.156 ] 00:13:05.156 }' 00:13:05.156 10:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.156 10:08:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:05.726 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:05.727 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:05.727 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:05.727 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:05.727 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:05.727 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:05.727 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:05.727 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:05.727 [2024-06-10 10:08:27.562198] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:05.727 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:05.727 "name": "Existed_Raid", 00:13:05.727 "aliases": [ 00:13:05.727 "6e91d988-782e-4a9d-8420-18bd5a6fa51d" 00:13:05.727 ], 00:13:05.727 "product_name": "Raid Volume", 00:13:05.727 "block_size": 512, 00:13:05.727 "num_blocks": 190464, 00:13:05.727 "uuid": "6e91d988-782e-4a9d-8420-18bd5a6fa51d", 00:13:05.727 "assigned_rate_limits": { 00:13:05.727 "rw_ios_per_sec": 0, 00:13:05.727 "rw_mbytes_per_sec": 0, 00:13:05.727 "r_mbytes_per_sec": 0, 00:13:05.727 "w_mbytes_per_sec": 0 00:13:05.727 }, 00:13:05.727 "claimed": false, 00:13:05.727 "zoned": false, 00:13:05.727 "supported_io_types": { 00:13:05.727 "read": true, 00:13:05.727 "write": true, 00:13:05.727 "unmap": true, 00:13:05.727 "write_zeroes": true, 00:13:05.727 "flush": true, 00:13:05.727 "reset": true, 00:13:05.727 "compare": false, 00:13:05.727 "compare_and_write": false, 00:13:05.727 "abort": false, 00:13:05.727 "nvme_admin": false, 00:13:05.727 "nvme_io": false 00:13:05.727 }, 00:13:05.727 "memory_domains": [ 00:13:05.727 { 00:13:05.727 "dma_device_id": "system", 00:13:05.727 "dma_device_type": 1 00:13:05.727 }, 00:13:05.727 { 00:13:05.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.727 "dma_device_type": 2 00:13:05.727 }, 00:13:05.727 { 00:13:05.727 "dma_device_id": "system", 00:13:05.727 "dma_device_type": 1 00:13:05.727 }, 00:13:05.727 { 00:13:05.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.727 "dma_device_type": 2 00:13:05.727 }, 00:13:05.727 { 00:13:05.727 "dma_device_id": "system", 00:13:05.727 "dma_device_type": 1 00:13:05.727 }, 00:13:05.727 { 00:13:05.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.727 
"dma_device_type": 2 00:13:05.727 } 00:13:05.727 ], 00:13:05.727 "driver_specific": { 00:13:05.727 "raid": { 00:13:05.727 "uuid": "6e91d988-782e-4a9d-8420-18bd5a6fa51d", 00:13:05.727 "strip_size_kb": 64, 00:13:05.727 "state": "online", 00:13:05.727 "raid_level": "raid0", 00:13:05.727 "superblock": true, 00:13:05.727 "num_base_bdevs": 3, 00:13:05.727 "num_base_bdevs_discovered": 3, 00:13:05.727 "num_base_bdevs_operational": 3, 00:13:05.727 "base_bdevs_list": [ 00:13:05.727 { 00:13:05.727 "name": "NewBaseBdev", 00:13:05.727 "uuid": "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9", 00:13:05.727 "is_configured": true, 00:13:05.727 "data_offset": 2048, 00:13:05.727 "data_size": 63488 00:13:05.727 }, 00:13:05.727 { 00:13:05.727 "name": "BaseBdev2", 00:13:05.727 "uuid": "0962c7aa-c41f-4782-bca0-8edd5baefda1", 00:13:05.727 "is_configured": true, 00:13:05.727 "data_offset": 2048, 00:13:05.727 "data_size": 63488 00:13:05.727 }, 00:13:05.727 { 00:13:05.727 "name": "BaseBdev3", 00:13:05.727 "uuid": "0a7057ef-ebf0-427b-8014-0379d8788fdf", 00:13:05.727 "is_configured": true, 00:13:05.727 "data_offset": 2048, 00:13:05.727 "data_size": 63488 00:13:05.727 } 00:13:05.727 ] 00:13:05.727 } 00:13:05.727 } 00:13:05.727 }' 00:13:05.727 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:05.987 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:05.987 BaseBdev2 00:13:05.987 BaseBdev3' 00:13:05.987 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:05.987 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:05.987 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:05.987 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:05.987 "name": "NewBaseBdev", 00:13:05.987 "aliases": [ 00:13:05.987 "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9" 00:13:05.987 ], 00:13:05.987 "product_name": "Malloc disk", 00:13:05.987 "block_size": 512, 00:13:05.987 "num_blocks": 65536, 00:13:05.987 "uuid": "bfdfccba-4c1c-44ec-8eaf-90d101eaf6d9", 00:13:05.987 "assigned_rate_limits": { 00:13:05.987 "rw_ios_per_sec": 0, 00:13:05.987 "rw_mbytes_per_sec": 0, 00:13:05.987 "r_mbytes_per_sec": 0, 00:13:05.987 "w_mbytes_per_sec": 0 00:13:05.987 }, 00:13:05.987 "claimed": true, 00:13:05.987 "claim_type": "exclusive_write", 00:13:05.987 "zoned": false, 00:13:05.987 "supported_io_types": { 00:13:05.987 "read": true, 00:13:05.987 "write": true, 00:13:05.987 "unmap": true, 00:13:05.987 "write_zeroes": true, 00:13:05.987 "flush": true, 00:13:05.987 "reset": true, 00:13:05.987 "compare": false, 00:13:05.987 "compare_and_write": false, 00:13:05.987 "abort": true, 00:13:05.987 "nvme_admin": false, 00:13:05.987 "nvme_io": false 00:13:05.987 }, 00:13:05.987 "memory_domains": [ 00:13:05.987 { 00:13:05.987 "dma_device_id": "system", 00:13:05.987 "dma_device_type": 1 00:13:05.987 }, 00:13:05.987 { 00:13:05.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.987 "dma_device_type": 2 00:13:05.987 } 00:13:05.987 ], 00:13:05.987 "driver_specific": {} 00:13:05.987 }' 00:13:05.987 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.247 10:08:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.247 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:06.247 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.247 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.247 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:06.247 10:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.247 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.247 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:06.247 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.247 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.508 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:06.508 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:06.508 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:06.508 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:06.508 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:06.508 "name": "BaseBdev2", 00:13:06.508 "aliases": [ 00:13:06.508 "0962c7aa-c41f-4782-bca0-8edd5baefda1" 00:13:06.508 ], 00:13:06.508 "product_name": "Malloc disk", 00:13:06.508 "block_size": 512, 00:13:06.508 "num_blocks": 65536, 00:13:06.508 "uuid": "0962c7aa-c41f-4782-bca0-8edd5baefda1", 00:13:06.508 "assigned_rate_limits": { 00:13:06.508 "rw_ios_per_sec": 0, 00:13:06.508 "rw_mbytes_per_sec": 0, 00:13:06.508 "r_mbytes_per_sec": 0, 00:13:06.508 "w_mbytes_per_sec": 0 00:13:06.508 }, 00:13:06.508 "claimed": true, 00:13:06.508 "claim_type": "exclusive_write", 00:13:06.508 "zoned": false, 00:13:06.508 "supported_io_types": { 00:13:06.508 "read": true, 00:13:06.508 "write": true, 00:13:06.508 "unmap": true, 00:13:06.508 "write_zeroes": true, 00:13:06.508 "flush": true, 00:13:06.508 "reset": true, 00:13:06.508 "compare": false, 00:13:06.508 "compare_and_write": false, 00:13:06.508 "abort": true, 00:13:06.508 "nvme_admin": false, 00:13:06.508 "nvme_io": false 00:13:06.508 }, 00:13:06.508 "memory_domains": [ 00:13:06.508 { 00:13:06.508 "dma_device_id": "system", 00:13:06.508 "dma_device_type": 1 00:13:06.508 }, 00:13:06.508 { 00:13:06.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.508 "dma_device_type": 2 00:13:06.508 } 00:13:06.508 ], 00:13:06.508 "driver_specific": {} 00:13:06.508 }' 00:13:06.508 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.508 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.769 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:06.769 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.769 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.769 10:08:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:06.769 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.769 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.769 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:06.769 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.769 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:07.029 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:07.029 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:07.029 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:07.029 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:07.029 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:07.029 "name": "BaseBdev3", 00:13:07.029 "aliases": [ 00:13:07.029 "0a7057ef-ebf0-427b-8014-0379d8788fdf" 00:13:07.029 ], 00:13:07.029 "product_name": "Malloc disk", 00:13:07.029 "block_size": 512, 00:13:07.029 "num_blocks": 65536, 00:13:07.029 "uuid": "0a7057ef-ebf0-427b-8014-0379d8788fdf", 00:13:07.029 "assigned_rate_limits": { 00:13:07.029 "rw_ios_per_sec": 0, 00:13:07.029 "rw_mbytes_per_sec": 0, 00:13:07.029 "r_mbytes_per_sec": 0, 00:13:07.029 "w_mbytes_per_sec": 0 00:13:07.029 }, 00:13:07.029 "claimed": true, 00:13:07.029 "claim_type": "exclusive_write", 00:13:07.029 "zoned": false, 00:13:07.029 "supported_io_types": { 00:13:07.029 "read": true, 00:13:07.029 "write": true, 00:13:07.029 "unmap": true, 00:13:07.029 "write_zeroes": true, 00:13:07.029 "flush": true, 00:13:07.029 "reset": true, 00:13:07.029 "compare": false, 00:13:07.029 "compare_and_write": false, 00:13:07.029 "abort": true, 00:13:07.029 "nvme_admin": false, 00:13:07.030 "nvme_io": false 00:13:07.030 }, 00:13:07.030 "memory_domains": [ 00:13:07.030 { 00:13:07.030 "dma_device_id": "system", 00:13:07.030 "dma_device_type": 1 00:13:07.030 }, 00:13:07.030 { 00:13:07.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:07.030 "dma_device_type": 2 00:13:07.030 } 00:13:07.030 ], 00:13:07.030 "driver_specific": {} 00:13:07.030 }' 00:13:07.030 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:07.290 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:07.290 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:07.290 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:07.290 10:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:07.290 10:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:07.290 10:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:07.290 10:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:07.290 10:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:07.290 10:08:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:07.290 10:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:07.550 [2024-06-10 10:08:29.298387] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:07.550 [2024-06-10 10:08:29.298403] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:07.550 [2024-06-10 10:08:29.298436] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:07.550 [2024-06-10 10:08:29.298473] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:07.550 [2024-06-10 10:08:29.298479] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10cfd10 name Existed_Raid, state offline 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 983470 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 983470 ']' 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 983470 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 983470 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 983470' 00:13:07.550 killing process with pid 983470 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 983470 00:13:07.550 [2024-06-10 10:08:29.363041] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:07.550 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 983470 00:13:07.550 [2024-06-10 10:08:29.377695] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:07.809 10:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:07.809 00:13:07.809 real 0m23.168s 00:13:07.809 user 0m43.432s 00:13:07.809 sys 0m3.412s 00:13:07.809 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:07.809 10:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:07.809 ************************************ 00:13:07.809 END TEST raid_state_function_test_sb 00:13:07.809 ************************************ 00:13:07.809 10:08:29 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:13:07.809 10:08:29 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:13:07.809 10:08:29 bdev_raid -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:13:07.810 10:08:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:07.810 ************************************ 00:13:07.810 START TEST raid_superblock_test 00:13:07.810 ************************************ 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 3 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=988065 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 988065 /var/tmp/spdk-raid.sock 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 988065 ']' 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:07.810 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:07.810 10:08:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.810 [2024-06-10 10:08:29.638639] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:13:07.810 [2024-06-10 10:08:29.638687] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid988065 ] 00:13:08.069 [2024-06-10 10:08:29.723433] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:08.069 [2024-06-10 10:08:29.785211] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:13:08.069 [2024-06-10 10:08:29.825961] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:08.069 [2024-06-10 10:08:29.825985] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:08.638 10:08:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:08.638 10:08:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:13:08.639 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:08.639 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:08.639 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:08.639 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:08.639 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:08.639 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:08.639 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:08.639 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:08.639 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:08.899 malloc1 00:13:08.899 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:08.899 [2024-06-10 10:08:30.723572] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:08.899 [2024-06-10 10:08:30.723606] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:08.899 [2024-06-10 10:08:30.723617] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1393990 00:13:08.899 [2024-06-10 10:08:30.723624] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:08.899 [2024-06-10 10:08:30.724935] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:08.899 [2024-06-10 10:08:30.724954] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:08.899 pt1 00:13:08.899 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:08.899 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:08.899 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:08.899 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:08.899 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:08.899 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:08.899 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:08.899 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:08.899 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:09.159 malloc2 00:13:09.159 10:08:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:09.419 [2024-06-10 10:08:31.042586] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:09.419 [2024-06-10 10:08:31.042615] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:09.419 [2024-06-10 10:08:31.042626] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13944e0 00:13:09.419 [2024-06-10 10:08:31.042632] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:09.419 [2024-06-10 10:08:31.043830] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:09.419 [2024-06-10 10:08:31.043849] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:09.419 pt2 00:13:09.419 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:09.419 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:09.419 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:13:09.419 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:13:09.419 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:09.419 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:09.419 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:09.419 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:09.419 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:09.419 malloc3 00:13:09.419 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:09.679 [2024-06-10 10:08:31.369258] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:09.679 [2024-06-10 10:08:31.369285] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:09.679 [2024-06-10 10:08:31.369295] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15404e0 00:13:09.679 [2024-06-10 10:08:31.369301] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:09.679 [2024-06-10 10:08:31.370470] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:09.679 [2024-06-10 10:08:31.370488] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:09.679 pt3 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:09.679 [2024-06-10 10:08:31.509627] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:09.679 [2024-06-10 10:08:31.510613] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:09.679 [2024-06-10 10:08:31.510657] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:09.679 [2024-06-10 10:08:31.510771] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1541830 00:13:09.679 [2024-06-10 10:08:31.510777] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:09.679 [2024-06-10 10:08:31.510928] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x105fdc0 00:13:09.679 [2024-06-10 10:08:31.511032] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1541830 00:13:09.679 [2024-06-10 10:08:31.511037] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1541830 00:13:09.679 [2024-06-10 10:08:31.511104] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.679 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:09.939 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.939 "name": "raid_bdev1", 00:13:09.939 "uuid": "79dd05fa-029a-4e57-8cbb-25b96bb53f96", 00:13:09.939 "strip_size_kb": 64, 00:13:09.939 "state": "online", 00:13:09.939 "raid_level": "raid0", 00:13:09.939 "superblock": true, 00:13:09.939 "num_base_bdevs": 3, 
00:13:09.939 "num_base_bdevs_discovered": 3, 00:13:09.939 "num_base_bdevs_operational": 3, 00:13:09.939 "base_bdevs_list": [ 00:13:09.939 { 00:13:09.939 "name": "pt1", 00:13:09.939 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:09.939 "is_configured": true, 00:13:09.939 "data_offset": 2048, 00:13:09.939 "data_size": 63488 00:13:09.939 }, 00:13:09.939 { 00:13:09.939 "name": "pt2", 00:13:09.939 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:09.939 "is_configured": true, 00:13:09.939 "data_offset": 2048, 00:13:09.940 "data_size": 63488 00:13:09.940 }, 00:13:09.940 { 00:13:09.940 "name": "pt3", 00:13:09.940 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:09.940 "is_configured": true, 00:13:09.940 "data_offset": 2048, 00:13:09.940 "data_size": 63488 00:13:09.940 } 00:13:09.940 ] 00:13:09.940 }' 00:13:09.940 10:08:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.940 10:08:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:10.509 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:10.509 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:10.509 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:10.509 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:10.509 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:10.509 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:10.509 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:10.509 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:10.769 [2024-06-10 10:08:32.412083] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:10.769 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:10.769 "name": "raid_bdev1", 00:13:10.769 "aliases": [ 00:13:10.769 "79dd05fa-029a-4e57-8cbb-25b96bb53f96" 00:13:10.769 ], 00:13:10.769 "product_name": "Raid Volume", 00:13:10.769 "block_size": 512, 00:13:10.769 "num_blocks": 190464, 00:13:10.769 "uuid": "79dd05fa-029a-4e57-8cbb-25b96bb53f96", 00:13:10.769 "assigned_rate_limits": { 00:13:10.769 "rw_ios_per_sec": 0, 00:13:10.769 "rw_mbytes_per_sec": 0, 00:13:10.769 "r_mbytes_per_sec": 0, 00:13:10.769 "w_mbytes_per_sec": 0 00:13:10.769 }, 00:13:10.769 "claimed": false, 00:13:10.769 "zoned": false, 00:13:10.769 "supported_io_types": { 00:13:10.769 "read": true, 00:13:10.769 "write": true, 00:13:10.769 "unmap": true, 00:13:10.769 "write_zeroes": true, 00:13:10.769 "flush": true, 00:13:10.769 "reset": true, 00:13:10.769 "compare": false, 00:13:10.769 "compare_and_write": false, 00:13:10.769 "abort": false, 00:13:10.769 "nvme_admin": false, 00:13:10.769 "nvme_io": false 00:13:10.769 }, 00:13:10.769 "memory_domains": [ 00:13:10.769 { 00:13:10.769 "dma_device_id": "system", 00:13:10.769 "dma_device_type": 1 00:13:10.769 }, 00:13:10.769 { 00:13:10.769 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.769 "dma_device_type": 2 00:13:10.769 }, 00:13:10.769 { 00:13:10.769 "dma_device_id": "system", 00:13:10.769 "dma_device_type": 1 00:13:10.769 }, 00:13:10.769 { 00:13:10.769 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:10.769 "dma_device_type": 2 00:13:10.769 }, 00:13:10.769 { 00:13:10.769 "dma_device_id": "system", 00:13:10.769 "dma_device_type": 1 00:13:10.769 }, 00:13:10.769 { 00:13:10.769 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.769 "dma_device_type": 2 00:13:10.769 } 00:13:10.769 ], 00:13:10.769 "driver_specific": { 00:13:10.769 "raid": { 00:13:10.769 "uuid": "79dd05fa-029a-4e57-8cbb-25b96bb53f96", 00:13:10.769 "strip_size_kb": 64, 00:13:10.769 "state": "online", 00:13:10.769 "raid_level": "raid0", 00:13:10.769 "superblock": true, 00:13:10.769 "num_base_bdevs": 3, 00:13:10.769 "num_base_bdevs_discovered": 3, 00:13:10.769 "num_base_bdevs_operational": 3, 00:13:10.769 "base_bdevs_list": [ 00:13:10.769 { 00:13:10.769 "name": "pt1", 00:13:10.769 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:10.769 "is_configured": true, 00:13:10.769 "data_offset": 2048, 00:13:10.769 "data_size": 63488 00:13:10.769 }, 00:13:10.769 { 00:13:10.769 "name": "pt2", 00:13:10.769 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:10.769 "is_configured": true, 00:13:10.769 "data_offset": 2048, 00:13:10.769 "data_size": 63488 00:13:10.769 }, 00:13:10.769 { 00:13:10.769 "name": "pt3", 00:13:10.769 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:10.769 "is_configured": true, 00:13:10.769 "data_offset": 2048, 00:13:10.769 "data_size": 63488 00:13:10.769 } 00:13:10.769 ] 00:13:10.769 } 00:13:10.769 } 00:13:10.769 }' 00:13:10.769 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:10.769 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:10.769 pt2 00:13:10.769 pt3' 00:13:10.769 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:10.769 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:10.769 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:11.029 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:11.029 "name": "pt1", 00:13:11.029 "aliases": [ 00:13:11.029 "00000000-0000-0000-0000-000000000001" 00:13:11.029 ], 00:13:11.029 "product_name": "passthru", 00:13:11.029 "block_size": 512, 00:13:11.029 "num_blocks": 65536, 00:13:11.029 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:11.029 "assigned_rate_limits": { 00:13:11.029 "rw_ios_per_sec": 0, 00:13:11.029 "rw_mbytes_per_sec": 0, 00:13:11.029 "r_mbytes_per_sec": 0, 00:13:11.029 "w_mbytes_per_sec": 0 00:13:11.029 }, 00:13:11.029 "claimed": true, 00:13:11.029 "claim_type": "exclusive_write", 00:13:11.029 "zoned": false, 00:13:11.029 "supported_io_types": { 00:13:11.029 "read": true, 00:13:11.029 "write": true, 00:13:11.029 "unmap": true, 00:13:11.029 "write_zeroes": true, 00:13:11.029 "flush": true, 00:13:11.029 "reset": true, 00:13:11.029 "compare": false, 00:13:11.029 "compare_and_write": false, 00:13:11.029 "abort": true, 00:13:11.029 "nvme_admin": false, 00:13:11.029 "nvme_io": false 00:13:11.029 }, 00:13:11.029 "memory_domains": [ 00:13:11.029 { 00:13:11.029 "dma_device_id": "system", 00:13:11.029 "dma_device_type": 1 00:13:11.029 }, 00:13:11.029 { 00:13:11.029 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.029 "dma_device_type": 2 00:13:11.029 } 00:13:11.029 ], 00:13:11.029 "driver_specific": { 
00:13:11.029 "passthru": { 00:13:11.029 "name": "pt1", 00:13:11.029 "base_bdev_name": "malloc1" 00:13:11.029 } 00:13:11.029 } 00:13:11.029 }' 00:13:11.029 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:11.029 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:11.029 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:11.030 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:11.030 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:11.030 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:11.030 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:11.030 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:11.290 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:11.290 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:11.290 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:11.290 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:11.290 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:11.290 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:11.290 10:08:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:11.551 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:11.551 "name": "pt2", 00:13:11.551 "aliases": [ 00:13:11.551 "00000000-0000-0000-0000-000000000002" 00:13:11.551 ], 00:13:11.551 "product_name": "passthru", 00:13:11.551 "block_size": 512, 00:13:11.551 "num_blocks": 65536, 00:13:11.551 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:11.551 "assigned_rate_limits": { 00:13:11.551 "rw_ios_per_sec": 0, 00:13:11.551 "rw_mbytes_per_sec": 0, 00:13:11.551 "r_mbytes_per_sec": 0, 00:13:11.551 "w_mbytes_per_sec": 0 00:13:11.551 }, 00:13:11.551 "claimed": true, 00:13:11.551 "claim_type": "exclusive_write", 00:13:11.551 "zoned": false, 00:13:11.551 "supported_io_types": { 00:13:11.551 "read": true, 00:13:11.551 "write": true, 00:13:11.551 "unmap": true, 00:13:11.551 "write_zeroes": true, 00:13:11.551 "flush": true, 00:13:11.551 "reset": true, 00:13:11.551 "compare": false, 00:13:11.551 "compare_and_write": false, 00:13:11.551 "abort": true, 00:13:11.551 "nvme_admin": false, 00:13:11.551 "nvme_io": false 00:13:11.551 }, 00:13:11.551 "memory_domains": [ 00:13:11.551 { 00:13:11.551 "dma_device_id": "system", 00:13:11.551 "dma_device_type": 1 00:13:11.551 }, 00:13:11.551 { 00:13:11.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.551 "dma_device_type": 2 00:13:11.551 } 00:13:11.551 ], 00:13:11.551 "driver_specific": { 00:13:11.551 "passthru": { 00:13:11.551 "name": "pt2", 00:13:11.551 "base_bdev_name": "malloc2" 00:13:11.551 } 00:13:11.551 } 00:13:11.551 }' 00:13:11.551 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:11.551 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:11.551 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 
]] 00:13:11.551 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:11.551 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:11.551 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:11.551 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:11.551 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:11.551 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:11.551 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:11.813 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:11.813 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:11.813 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:11.813 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:11.813 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:11.813 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:11.813 "name": "pt3", 00:13:11.813 "aliases": [ 00:13:11.813 "00000000-0000-0000-0000-000000000003" 00:13:11.813 ], 00:13:11.813 "product_name": "passthru", 00:13:11.813 "block_size": 512, 00:13:11.813 "num_blocks": 65536, 00:13:11.813 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:11.813 "assigned_rate_limits": { 00:13:11.813 "rw_ios_per_sec": 0, 00:13:11.813 "rw_mbytes_per_sec": 0, 00:13:11.813 "r_mbytes_per_sec": 0, 00:13:11.813 "w_mbytes_per_sec": 0 00:13:11.813 }, 00:13:11.813 "claimed": true, 00:13:11.813 "claim_type": "exclusive_write", 00:13:11.813 "zoned": false, 00:13:11.813 "supported_io_types": { 00:13:11.813 "read": true, 00:13:11.813 "write": true, 00:13:11.813 "unmap": true, 00:13:11.813 "write_zeroes": true, 00:13:11.813 "flush": true, 00:13:11.813 "reset": true, 00:13:11.813 "compare": false, 00:13:11.813 "compare_and_write": false, 00:13:11.813 "abort": true, 00:13:11.813 "nvme_admin": false, 00:13:11.813 "nvme_io": false 00:13:11.813 }, 00:13:11.813 "memory_domains": [ 00:13:11.813 { 00:13:11.813 "dma_device_id": "system", 00:13:11.813 "dma_device_type": 1 00:13:11.813 }, 00:13:11.813 { 00:13:11.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.813 "dma_device_type": 2 00:13:11.813 } 00:13:11.813 ], 00:13:11.813 "driver_specific": { 00:13:11.813 "passthru": { 00:13:11.813 "name": "pt3", 00:13:11.813 "base_bdev_name": "malloc3" 00:13:11.814 } 00:13:11.814 } 00:13:11.814 }' 00:13:11.814 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:11.814 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:12.074 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:12.074 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:12.074 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:12.074 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:12.074 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:12.074 10:08:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:12.074 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:12.074 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:12.074 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:12.334 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:12.334 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:12.334 10:08:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:12.334 [2024-06-10 10:08:34.144455] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:12.334 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=79dd05fa-029a-4e57-8cbb-25b96bb53f96 00:13:12.334 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 79dd05fa-029a-4e57-8cbb-25b96bb53f96 ']' 00:13:12.334 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:12.594 [2024-06-10 10:08:34.336743] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:12.594 [2024-06-10 10:08:34.336755] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:12.594 [2024-06-10 10:08:34.336791] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:12.594 [2024-06-10 10:08:34.336836] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:12.594 [2024-06-10 10:08:34.336842] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1541830 name raid_bdev1, state offline 00:13:12.594 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.594 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:12.854 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:12.854 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:12.854 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:12.854 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:13.114 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:13.114 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:13.114 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:13.114 10:08:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:13.374 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:13.374 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:13.634 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:13.634 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:13.634 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:13:13.634 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:13.634 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:13.634 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:13:13.634 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:13.635 [2024-06-10 10:08:35.459546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:13.635 [2024-06-10 10:08:35.460599] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:13.635 [2024-06-10 10:08:35.460630] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:13.635 [2024-06-10 10:08:35.460664] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:13.635 [2024-06-10 10:08:35.460690] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:13.635 [2024-06-10 10:08:35.460703] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:13.635 [2024-06-10 10:08:35.460713] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:13.635 [2024-06-10 10:08:35.460719] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x153c900 name raid_bdev1, state configuring 00:13:13.635 
request: 00:13:13.635 { 00:13:13.635 "name": "raid_bdev1", 00:13:13.635 "raid_level": "raid0", 00:13:13.635 "base_bdevs": [ 00:13:13.635 "malloc1", 00:13:13.635 "malloc2", 00:13:13.635 "malloc3" 00:13:13.635 ], 00:13:13.635 "superblock": false, 00:13:13.635 "strip_size_kb": 64, 00:13:13.635 "method": "bdev_raid_create", 00:13:13.635 "req_id": 1 00:13:13.635 } 00:13:13.635 Got JSON-RPC error response 00:13:13.635 response: 00:13:13.635 { 00:13:13.635 "code": -17, 00:13:13.635 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:13.635 } 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.635 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:13.896 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:13.896 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:13.896 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:14.156 [2024-06-10 10:08:35.840460] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:14.156 [2024-06-10 10:08:35.840484] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:14.156 [2024-06-10 10:08:35.840494] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1542f50 00:13:14.156 [2024-06-10 10:08:35.840500] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:14.156 [2024-06-10 10:08:35.841735] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:14.156 [2024-06-10 10:08:35.841754] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:14.156 [2024-06-10 10:08:35.841797] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:14.156 [2024-06-10 10:08:35.841814] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:14.156 pt1 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.156 10:08:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:14.416 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:14.416 "name": "raid_bdev1", 00:13:14.416 "uuid": "79dd05fa-029a-4e57-8cbb-25b96bb53f96", 00:13:14.416 "strip_size_kb": 64, 00:13:14.416 "state": "configuring", 00:13:14.416 "raid_level": "raid0", 00:13:14.416 "superblock": true, 00:13:14.416 "num_base_bdevs": 3, 00:13:14.416 "num_base_bdevs_discovered": 1, 00:13:14.416 "num_base_bdevs_operational": 3, 00:13:14.416 "base_bdevs_list": [ 00:13:14.416 { 00:13:14.416 "name": "pt1", 00:13:14.416 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:14.416 "is_configured": true, 00:13:14.416 "data_offset": 2048, 00:13:14.416 "data_size": 63488 00:13:14.416 }, 00:13:14.416 { 00:13:14.416 "name": null, 00:13:14.416 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:14.416 "is_configured": false, 00:13:14.416 "data_offset": 2048, 00:13:14.416 "data_size": 63488 00:13:14.416 }, 00:13:14.416 { 00:13:14.416 "name": null, 00:13:14.416 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:14.416 "is_configured": false, 00:13:14.416 "data_offset": 2048, 00:13:14.416 "data_size": 63488 00:13:14.416 } 00:13:14.416 ] 00:13:14.416 }' 00:13:14.416 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:14.416 10:08:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.986 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:13:14.986 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:14.986 [2024-06-10 10:08:36.750762] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:14.986 [2024-06-10 10:08:36.750790] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:14.986 [2024-06-10 10:08:36.750802] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x153e5b0 00:13:14.986 [2024-06-10 10:08:36.750808] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:14.986 [2024-06-10 10:08:36.751060] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:14.986 [2024-06-10 10:08:36.751070] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:14.986 [2024-06-10 10:08:36.751107] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:14.986 [2024-06-10 10:08:36.751119] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:14.986 pt2 00:13:14.986 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:15.246 [2024-06-10 10:08:36.907163] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.246 10:08:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:15.246 10:08:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:15.246 "name": "raid_bdev1", 00:13:15.246 "uuid": "79dd05fa-029a-4e57-8cbb-25b96bb53f96", 00:13:15.246 "strip_size_kb": 64, 00:13:15.246 "state": "configuring", 00:13:15.246 "raid_level": "raid0", 00:13:15.246 "superblock": true, 00:13:15.246 "num_base_bdevs": 3, 00:13:15.246 "num_base_bdevs_discovered": 1, 00:13:15.246 "num_base_bdevs_operational": 3, 00:13:15.246 "base_bdevs_list": [ 00:13:15.246 { 00:13:15.246 "name": "pt1", 00:13:15.246 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:15.246 "is_configured": true, 00:13:15.246 "data_offset": 2048, 00:13:15.246 "data_size": 63488 00:13:15.246 }, 00:13:15.246 { 00:13:15.246 "name": null, 00:13:15.246 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:15.246 "is_configured": false, 00:13:15.246 "data_offset": 2048, 00:13:15.246 "data_size": 63488 00:13:15.246 }, 00:13:15.246 { 00:13:15.246 "name": null, 00:13:15.246 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:15.246 "is_configured": false, 00:13:15.246 "data_offset": 2048, 00:13:15.246 "data_size": 63488 00:13:15.246 } 00:13:15.246 ] 00:13:15.246 }' 00:13:15.246 10:08:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:15.246 10:08:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:15.815 10:08:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:15.815 10:08:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:15.815 10:08:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:16.076 [2024-06-10 10:08:37.833501] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:16.076 [2024-06-10 10:08:37.833529] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:13:16.076 [2024-06-10 10:08:37.833538] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15426a0 00:13:16.076 [2024-06-10 10:08:37.833544] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:16.076 [2024-06-10 10:08:37.833801] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:16.076 [2024-06-10 10:08:37.833811] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:16.076 [2024-06-10 10:08:37.833857] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:16.076 [2024-06-10 10:08:37.833868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:16.076 pt2 00:13:16.076 10:08:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:16.076 10:08:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:16.076 10:08:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:16.337 [2024-06-10 10:08:38.021976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:16.337 [2024-06-10 10:08:38.021996] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:16.337 [2024-06-10 10:08:38.022004] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1543570 00:13:16.337 [2024-06-10 10:08:38.022010] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:16.337 [2024-06-10 10:08:38.022237] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:16.337 [2024-06-10 10:08:38.022246] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:16.337 [2024-06-10 10:08:38.022281] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:16.337 [2024-06-10 10:08:38.022291] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:16.337 [2024-06-10 10:08:38.022369] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x153df50 00:13:16.337 [2024-06-10 10:08:38.022375] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:16.337 [2024-06-10 10:08:38.022507] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1540770 00:13:16.337 [2024-06-10 10:08:38.022602] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x153df50 00:13:16.337 [2024-06-10 10:08:38.022607] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x153df50 00:13:16.337 [2024-06-10 10:08:38.022677] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:16.337 pt3 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:16.337 10:08:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.337 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:16.597 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.597 "name": "raid_bdev1", 00:13:16.597 "uuid": "79dd05fa-029a-4e57-8cbb-25b96bb53f96", 00:13:16.597 "strip_size_kb": 64, 00:13:16.597 "state": "online", 00:13:16.597 "raid_level": "raid0", 00:13:16.597 "superblock": true, 00:13:16.597 "num_base_bdevs": 3, 00:13:16.597 "num_base_bdevs_discovered": 3, 00:13:16.597 "num_base_bdevs_operational": 3, 00:13:16.597 "base_bdevs_list": [ 00:13:16.597 { 00:13:16.597 "name": "pt1", 00:13:16.597 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:16.597 "is_configured": true, 00:13:16.597 "data_offset": 2048, 00:13:16.597 "data_size": 63488 00:13:16.597 }, 00:13:16.597 { 00:13:16.597 "name": "pt2", 00:13:16.597 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:16.597 "is_configured": true, 00:13:16.597 "data_offset": 2048, 00:13:16.597 "data_size": 63488 00:13:16.597 }, 00:13:16.597 { 00:13:16.597 "name": "pt3", 00:13:16.597 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:16.597 "is_configured": true, 00:13:16.597 "data_offset": 2048, 00:13:16.597 "data_size": 63488 00:13:16.597 } 00:13:16.597 ] 00:13:16.597 }' 00:13:16.597 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.597 10:08:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.167 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:17.167 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:17.167 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:17.167 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:17.167 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:17.167 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:17.167 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:17.167 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:17.167 [2024-06-10 10:08:38.944495] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:17.167 10:08:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:17.167 "name": "raid_bdev1", 00:13:17.167 "aliases": [ 00:13:17.167 "79dd05fa-029a-4e57-8cbb-25b96bb53f96" 00:13:17.167 ], 00:13:17.167 "product_name": "Raid Volume", 00:13:17.167 "block_size": 512, 00:13:17.167 "num_blocks": 190464, 00:13:17.167 "uuid": "79dd05fa-029a-4e57-8cbb-25b96bb53f96", 00:13:17.167 "assigned_rate_limits": { 00:13:17.167 "rw_ios_per_sec": 0, 00:13:17.167 "rw_mbytes_per_sec": 0, 00:13:17.167 "r_mbytes_per_sec": 0, 00:13:17.167 "w_mbytes_per_sec": 0 00:13:17.167 }, 00:13:17.167 "claimed": false, 00:13:17.167 "zoned": false, 00:13:17.167 "supported_io_types": { 00:13:17.167 "read": true, 00:13:17.167 "write": true, 00:13:17.167 "unmap": true, 00:13:17.167 "write_zeroes": true, 00:13:17.167 "flush": true, 00:13:17.167 "reset": true, 00:13:17.167 "compare": false, 00:13:17.167 "compare_and_write": false, 00:13:17.167 "abort": false, 00:13:17.167 "nvme_admin": false, 00:13:17.167 "nvme_io": false 00:13:17.167 }, 00:13:17.167 "memory_domains": [ 00:13:17.167 { 00:13:17.167 "dma_device_id": "system", 00:13:17.167 "dma_device_type": 1 00:13:17.167 }, 00:13:17.167 { 00:13:17.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.167 "dma_device_type": 2 00:13:17.167 }, 00:13:17.167 { 00:13:17.167 "dma_device_id": "system", 00:13:17.167 "dma_device_type": 1 00:13:17.167 }, 00:13:17.167 { 00:13:17.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.167 "dma_device_type": 2 00:13:17.167 }, 00:13:17.167 { 00:13:17.167 "dma_device_id": "system", 00:13:17.167 "dma_device_type": 1 00:13:17.167 }, 00:13:17.167 { 00:13:17.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.167 "dma_device_type": 2 00:13:17.167 } 00:13:17.167 ], 00:13:17.167 "driver_specific": { 00:13:17.167 "raid": { 00:13:17.167 "uuid": "79dd05fa-029a-4e57-8cbb-25b96bb53f96", 00:13:17.167 "strip_size_kb": 64, 00:13:17.167 "state": "online", 00:13:17.167 "raid_level": "raid0", 00:13:17.167 "superblock": true, 00:13:17.167 "num_base_bdevs": 3, 00:13:17.167 "num_base_bdevs_discovered": 3, 00:13:17.167 "num_base_bdevs_operational": 3, 00:13:17.167 "base_bdevs_list": [ 00:13:17.167 { 00:13:17.167 "name": "pt1", 00:13:17.167 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:17.167 "is_configured": true, 00:13:17.167 "data_offset": 2048, 00:13:17.167 "data_size": 63488 00:13:17.167 }, 00:13:17.167 { 00:13:17.167 "name": "pt2", 00:13:17.167 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:17.168 "is_configured": true, 00:13:17.168 "data_offset": 2048, 00:13:17.168 "data_size": 63488 00:13:17.168 }, 00:13:17.168 { 00:13:17.168 "name": "pt3", 00:13:17.168 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:17.168 "is_configured": true, 00:13:17.168 "data_offset": 2048, 00:13:17.168 "data_size": 63488 00:13:17.168 } 00:13:17.168 ] 00:13:17.168 } 00:13:17.168 } 00:13:17.168 }' 00:13:17.168 10:08:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:17.168 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:17.168 pt2 00:13:17.168 pt3' 00:13:17.168 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:17.168 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:17.168 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
jq '.[]' 00:13:17.427 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:17.427 "name": "pt1", 00:13:17.427 "aliases": [ 00:13:17.427 "00000000-0000-0000-0000-000000000001" 00:13:17.427 ], 00:13:17.427 "product_name": "passthru", 00:13:17.427 "block_size": 512, 00:13:17.427 "num_blocks": 65536, 00:13:17.427 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:17.427 "assigned_rate_limits": { 00:13:17.427 "rw_ios_per_sec": 0, 00:13:17.427 "rw_mbytes_per_sec": 0, 00:13:17.427 "r_mbytes_per_sec": 0, 00:13:17.427 "w_mbytes_per_sec": 0 00:13:17.427 }, 00:13:17.427 "claimed": true, 00:13:17.427 "claim_type": "exclusive_write", 00:13:17.427 "zoned": false, 00:13:17.427 "supported_io_types": { 00:13:17.428 "read": true, 00:13:17.428 "write": true, 00:13:17.428 "unmap": true, 00:13:17.428 "write_zeroes": true, 00:13:17.428 "flush": true, 00:13:17.428 "reset": true, 00:13:17.428 "compare": false, 00:13:17.428 "compare_and_write": false, 00:13:17.428 "abort": true, 00:13:17.428 "nvme_admin": false, 00:13:17.428 "nvme_io": false 00:13:17.428 }, 00:13:17.428 "memory_domains": [ 00:13:17.428 { 00:13:17.428 "dma_device_id": "system", 00:13:17.428 "dma_device_type": 1 00:13:17.428 }, 00:13:17.428 { 00:13:17.428 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.428 "dma_device_type": 2 00:13:17.428 } 00:13:17.428 ], 00:13:17.428 "driver_specific": { 00:13:17.428 "passthru": { 00:13:17.428 "name": "pt1", 00:13:17.428 "base_bdev_name": "malloc1" 00:13:17.428 } 00:13:17.428 } 00:13:17.428 }' 00:13:17.428 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:17.428 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:17.428 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:17.428 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.428 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.687 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:17.687 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.688 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.688 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:17.688 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.688 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.688 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:17.688 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:17.688 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:17.688 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:17.948 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:17.948 "name": "pt2", 00:13:17.948 "aliases": [ 00:13:17.948 "00000000-0000-0000-0000-000000000002" 00:13:17.948 ], 00:13:17.948 "product_name": "passthru", 00:13:17.948 "block_size": 512, 00:13:17.948 "num_blocks": 65536, 00:13:17.948 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:17.948 "assigned_rate_limits": 
{ 00:13:17.948 "rw_ios_per_sec": 0, 00:13:17.948 "rw_mbytes_per_sec": 0, 00:13:17.948 "r_mbytes_per_sec": 0, 00:13:17.948 "w_mbytes_per_sec": 0 00:13:17.948 }, 00:13:17.948 "claimed": true, 00:13:17.948 "claim_type": "exclusive_write", 00:13:17.948 "zoned": false, 00:13:17.948 "supported_io_types": { 00:13:17.948 "read": true, 00:13:17.948 "write": true, 00:13:17.948 "unmap": true, 00:13:17.948 "write_zeroes": true, 00:13:17.948 "flush": true, 00:13:17.948 "reset": true, 00:13:17.948 "compare": false, 00:13:17.948 "compare_and_write": false, 00:13:17.948 "abort": true, 00:13:17.948 "nvme_admin": false, 00:13:17.948 "nvme_io": false 00:13:17.948 }, 00:13:17.948 "memory_domains": [ 00:13:17.948 { 00:13:17.948 "dma_device_id": "system", 00:13:17.948 "dma_device_type": 1 00:13:17.948 }, 00:13:17.948 { 00:13:17.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.948 "dma_device_type": 2 00:13:17.948 } 00:13:17.948 ], 00:13:17.948 "driver_specific": { 00:13:17.948 "passthru": { 00:13:17.948 "name": "pt2", 00:13:17.948 "base_bdev_name": "malloc2" 00:13:17.948 } 00:13:17.948 } 00:13:17.948 }' 00:13:17.948 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:17.948 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:17.948 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:17.948 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.948 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.948 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:18.208 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.208 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.208 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:18.208 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.208 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.208 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:18.208 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:18.208 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:18.209 10:08:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:18.468 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:18.468 "name": "pt3", 00:13:18.468 "aliases": [ 00:13:18.468 "00000000-0000-0000-0000-000000000003" 00:13:18.468 ], 00:13:18.468 "product_name": "passthru", 00:13:18.468 "block_size": 512, 00:13:18.468 "num_blocks": 65536, 00:13:18.468 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:18.468 "assigned_rate_limits": { 00:13:18.468 "rw_ios_per_sec": 0, 00:13:18.468 "rw_mbytes_per_sec": 0, 00:13:18.468 "r_mbytes_per_sec": 0, 00:13:18.468 "w_mbytes_per_sec": 0 00:13:18.468 }, 00:13:18.468 "claimed": true, 00:13:18.468 "claim_type": "exclusive_write", 00:13:18.468 "zoned": false, 00:13:18.468 "supported_io_types": { 00:13:18.468 "read": true, 00:13:18.468 "write": true, 00:13:18.468 "unmap": true, 00:13:18.468 "write_zeroes": true, 00:13:18.468 
"flush": true, 00:13:18.468 "reset": true, 00:13:18.468 "compare": false, 00:13:18.468 "compare_and_write": false, 00:13:18.468 "abort": true, 00:13:18.468 "nvme_admin": false, 00:13:18.468 "nvme_io": false 00:13:18.468 }, 00:13:18.468 "memory_domains": [ 00:13:18.468 { 00:13:18.468 "dma_device_id": "system", 00:13:18.468 "dma_device_type": 1 00:13:18.468 }, 00:13:18.468 { 00:13:18.468 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.468 "dma_device_type": 2 00:13:18.468 } 00:13:18.468 ], 00:13:18.468 "driver_specific": { 00:13:18.468 "passthru": { 00:13:18.468 "name": "pt3", 00:13:18.468 "base_bdev_name": "malloc3" 00:13:18.468 } 00:13:18.468 } 00:13:18.468 }' 00:13:18.468 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.468 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.468 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:18.468 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.468 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.468 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:18.468 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.728 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.728 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:18.728 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.728 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.728 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:18.728 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:18.728 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:18.987 [2024-06-10 10:08:40.684967] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 79dd05fa-029a-4e57-8cbb-25b96bb53f96 '!=' 79dd05fa-029a-4e57-8cbb-25b96bb53f96 ']' 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 988065 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 988065 ']' 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 988065 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 988065 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:18.987 10:08:40 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 988065' 00:13:18.987 killing process with pid 988065 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 988065 00:13:18.987 [2024-06-10 10:08:40.753395] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:18.987 [2024-06-10 10:08:40.753432] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:18.987 [2024-06-10 10:08:40.753470] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:18.987 [2024-06-10 10:08:40.753476] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x153df50 name raid_bdev1, state offline 00:13:18.987 10:08:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 988065 00:13:18.987 [2024-06-10 10:08:40.768345] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:19.247 10:08:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:19.247 00:13:19.247 real 0m11.314s 00:13:19.247 user 0m20.850s 00:13:19.247 sys 0m1.655s 00:13:19.247 10:08:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:19.247 10:08:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.247 ************************************ 00:13:19.247 END TEST raid_superblock_test 00:13:19.247 ************************************ 00:13:19.247 10:08:40 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:13:19.247 10:08:40 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:13:19.247 10:08:40 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:19.247 10:08:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:19.247 ************************************ 00:13:19.247 START TEST raid_read_error_test 00:13:19.247 ************************************ 00:13:19.247 10:08:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 3 read 00:13:19.247 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:19.247 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:19.247 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:19.247 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.6cNqXRbzR9 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=990201 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 990201 /var/tmp/spdk-raid.sock 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 990201 ']' 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:19.248 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:19.248 10:08:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.248 [2024-06-10 10:08:41.031818] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
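(The raid_read_error_test run that bdevperf is being started for here layers each RAID member as malloc -> error -> passthru, so that I/O failures can be injected underneath the passthru bdev, and then groups the three passthru bdevs into a raid0 volume. What follows is only a condensed sketch of that setup, assembled from the rpc.py calls visible in this trace; the loop is an editorial shorthand, while the paths, names, and sizes are the ones this run actually uses.)

# bdevperf is launched with its own JSON-RPC socket, as shown in the trace above:
#   .../spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
for i in 1 2 3; do
    $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc        # 32 MB backing bdev, 512-byte blocks
    $rpc bdev_error_create BaseBdev${i}_malloc                   # error-injection wrapper, surfaces as EE_BaseBdev${i}_malloc
    $rpc bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
done
$rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s   # -z: strip size in KB, -s: with superblock
# the read-error case later in this test is triggered with:
$rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure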
00:13:19.248 [2024-06-10 10:08:41.031871] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid990201 ] 00:13:19.508 [2024-06-10 10:08:41.120177] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:19.508 [2024-06-10 10:08:41.186166] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:13:19.508 [2024-06-10 10:08:41.235337] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:19.508 [2024-06-10 10:08:41.235360] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:20.121 10:08:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:20.121 10:08:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:13:20.121 10:08:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:20.121 10:08:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:20.404 BaseBdev1_malloc 00:13:20.404 10:08:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:20.404 true 00:13:20.404 10:08:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:20.663 [2024-06-10 10:08:42.426535] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:20.663 [2024-06-10 10:08:42.426568] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:20.663 [2024-06-10 10:08:42.426580] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd7d10 00:13:20.663 [2024-06-10 10:08:42.426587] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:20.663 [2024-06-10 10:08:42.427950] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:20.663 [2024-06-10 10:08:42.427969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:20.663 BaseBdev1 00:13:20.663 10:08:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:20.663 10:08:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:20.923 BaseBdev2_malloc 00:13:20.923 10:08:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:21.182 true 00:13:21.182 10:08:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:21.182 [2024-06-10 10:08:42.981908] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:21.182 [2024-06-10 10:08:42.981937] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:13:21.182 [2024-06-10 10:08:42.981948] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcdc710 00:13:21.182 [2024-06-10 10:08:42.981955] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:21.182 [2024-06-10 10:08:42.983137] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:21.182 [2024-06-10 10:08:42.983164] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:21.182 BaseBdev2 00:13:21.182 10:08:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:21.182 10:08:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:21.441 BaseBdev3_malloc 00:13:21.441 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:21.702 true 00:13:21.702 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:21.702 [2024-06-10 10:08:43.549245] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:21.702 [2024-06-10 10:08:43.549274] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:21.702 [2024-06-10 10:08:43.549284] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcdd340 00:13:21.702 [2024-06-10 10:08:43.549290] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:21.702 [2024-06-10 10:08:43.550470] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:21.702 [2024-06-10 10:08:43.550489] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:21.702 BaseBdev3 00:13:21.702 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:21.962 [2024-06-10 10:08:43.737741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:21.962 [2024-06-10 10:08:43.738744] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:21.962 [2024-06-10 10:08:43.738798] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:21.962 [2024-06-10 10:08:43.738964] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xce0160 00:13:21.962 [2024-06-10 10:08:43.738971] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:21.962 [2024-06-10 10:08:43.739114] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcd96e0 00:13:21.962 [2024-06-10 10:08:43.739230] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xce0160 00:13:21.962 [2024-06-10 10:08:43.739235] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xce0160 00:13:21.962 [2024-06-10 10:08:43.739308] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.962 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:22.222 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:22.222 "name": "raid_bdev1", 00:13:22.222 "uuid": "c84db1c8-d4eb-40b1-a0d0-3fdf1fa8ffad", 00:13:22.222 "strip_size_kb": 64, 00:13:22.222 "state": "online", 00:13:22.222 "raid_level": "raid0", 00:13:22.222 "superblock": true, 00:13:22.222 "num_base_bdevs": 3, 00:13:22.222 "num_base_bdevs_discovered": 3, 00:13:22.222 "num_base_bdevs_operational": 3, 00:13:22.222 "base_bdevs_list": [ 00:13:22.222 { 00:13:22.222 "name": "BaseBdev1", 00:13:22.222 "uuid": "8cc13869-663e-5018-864b-611508f0cf3d", 00:13:22.222 "is_configured": true, 00:13:22.222 "data_offset": 2048, 00:13:22.222 "data_size": 63488 00:13:22.222 }, 00:13:22.222 { 00:13:22.222 "name": "BaseBdev2", 00:13:22.222 "uuid": "63b0fe3e-7d26-51a2-a683-352587e10c28", 00:13:22.222 "is_configured": true, 00:13:22.222 "data_offset": 2048, 00:13:22.222 "data_size": 63488 00:13:22.222 }, 00:13:22.222 { 00:13:22.222 "name": "BaseBdev3", 00:13:22.222 "uuid": "1c6f711c-481a-530f-a514-3823ce515f57", 00:13:22.222 "is_configured": true, 00:13:22.222 "data_offset": 2048, 00:13:22.222 "data_size": 63488 00:13:22.222 } 00:13:22.222 ] 00:13:22.222 }' 00:13:22.222 10:08:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:22.222 10:08:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:22.793 10:08:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:22.793 10:08:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:22.793 [2024-06-10 10:08:44.568015] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x834d40 00:13:23.735 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:23.996 10:08:45 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.996 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:24.257 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.257 "name": "raid_bdev1", 00:13:24.257 "uuid": "c84db1c8-d4eb-40b1-a0d0-3fdf1fa8ffad", 00:13:24.257 "strip_size_kb": 64, 00:13:24.257 "state": "online", 00:13:24.257 "raid_level": "raid0", 00:13:24.257 "superblock": true, 00:13:24.257 "num_base_bdevs": 3, 00:13:24.257 "num_base_bdevs_discovered": 3, 00:13:24.257 "num_base_bdevs_operational": 3, 00:13:24.257 "base_bdevs_list": [ 00:13:24.257 { 00:13:24.257 "name": "BaseBdev1", 00:13:24.257 "uuid": "8cc13869-663e-5018-864b-611508f0cf3d", 00:13:24.257 "is_configured": true, 00:13:24.257 "data_offset": 2048, 00:13:24.257 "data_size": 63488 00:13:24.257 }, 00:13:24.257 { 00:13:24.257 "name": "BaseBdev2", 00:13:24.257 "uuid": "63b0fe3e-7d26-51a2-a683-352587e10c28", 00:13:24.257 "is_configured": true, 00:13:24.257 "data_offset": 2048, 00:13:24.257 "data_size": 63488 00:13:24.257 }, 00:13:24.257 { 00:13:24.257 "name": "BaseBdev3", 00:13:24.257 "uuid": "1c6f711c-481a-530f-a514-3823ce515f57", 00:13:24.257 "is_configured": true, 00:13:24.257 "data_offset": 2048, 00:13:24.257 "data_size": 63488 00:13:24.257 } 00:13:24.257 ] 00:13:24.257 }' 00:13:24.257 10:08:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.257 10:08:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:24.828 [2024-06-10 10:08:46.576316] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:24.828 [2024-06-10 10:08:46.576350] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:24.828 [2024-06-10 10:08:46.578945] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:24.828 [2024-06-10 10:08:46.578970] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:24.828 [2024-06-10 10:08:46.578996] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:24.828 [2024-06-10 10:08:46.579002] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xce0160 name raid_bdev1, state offline 00:13:24.828 0 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 990201 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 990201 ']' 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 990201 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 990201 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 990201' 00:13:24.828 killing process with pid 990201 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 990201 00:13:24.828 [2024-06-10 10:08:46.646251] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:24.828 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 990201 00:13:24.828 [2024-06-10 10:08:46.657270] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:25.090 10:08:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.6cNqXRbzR9 00:13:25.090 10:08:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:25.090 10:08:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:25.090 10:08:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:13:25.090 10:08:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:13:25.090 10:08:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:25.090 10:08:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:25.090 10:08:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:13:25.090 00:13:25.090 real 0m5.826s 00:13:25.090 user 0m9.279s 00:13:25.090 sys 0m0.800s 00:13:25.090 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:25.090 10:08:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.090 ************************************ 00:13:25.090 END TEST raid_read_error_test 00:13:25.090 ************************************ 00:13:25.090 10:08:46 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:13:25.090 10:08:46 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:13:25.090 10:08:46 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:25.090 10:08:46 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:13:25.090 ************************************ 00:13:25.090 START TEST raid_write_error_test 00:13:25.090 ************************************ 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 3 write 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.CgquuLBLOq 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=991281 00:13:25.090 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 991281 /var/tmp/spdk-raid.sock 00:13:25.091 10:08:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:25.091 10:08:46 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 991281 ']' 00:13:25.091 10:08:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:25.091 10:08:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:13:25.091 10:08:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:25.091 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:25.091 10:08:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:25.091 10:08:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.091 [2024-06-10 10:08:46.937763] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:13:25.091 [2024-06-10 10:08:46.937817] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid991281 ] 00:13:25.352 [2024-06-10 10:08:47.028399] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:25.352 [2024-06-10 10:08:47.096569] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:13:25.352 [2024-06-10 10:08:47.146642] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:25.352 [2024-06-10 10:08:47.146667] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:25.924 10:08:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:25.924 10:08:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:13:25.924 10:08:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:25.924 10:08:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:26.185 BaseBdev1_malloc 00:13:26.185 10:08:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:26.445 true 00:13:26.445 10:08:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:26.705 [2024-06-10 10:08:48.317406] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:26.705 [2024-06-10 10:08:48.317437] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:26.705 [2024-06-10 10:08:48.317449] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x141bd10 00:13:26.705 [2024-06-10 10:08:48.317455] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:26.705 [2024-06-10 10:08:48.318783] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:26.705 [2024-06-10 10:08:48.318803] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:26.705 BaseBdev1 00:13:26.705 10:08:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in 
"${base_bdevs[@]}" 00:13:26.705 10:08:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:26.705 BaseBdev2_malloc 00:13:26.705 10:08:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:26.965 true 00:13:26.965 10:08:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:27.225 [2024-06-10 10:08:48.904554] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:27.225 [2024-06-10 10:08:48.904584] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:27.225 [2024-06-10 10:08:48.904597] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1420710 00:13:27.225 [2024-06-10 10:08:48.904604] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:27.225 [2024-06-10 10:08:48.905791] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:27.225 [2024-06-10 10:08:48.905811] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:27.225 BaseBdev2 00:13:27.225 10:08:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:27.225 10:08:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:27.485 BaseBdev3_malloc 00:13:27.485 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:27.485 true 00:13:27.485 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:27.745 [2024-06-10 10:08:49.459852] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:27.745 [2024-06-10 10:08:49.459878] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:27.745 [2024-06-10 10:08:49.459889] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1421340 00:13:27.745 [2024-06-10 10:08:49.459895] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:27.745 [2024-06-10 10:08:49.461056] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:27.745 [2024-06-10 10:08:49.461074] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:27.745 BaseBdev3 00:13:27.745 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:28.006 [2024-06-10 10:08:49.648351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:28.006 [2024-06-10 10:08:49.649358] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev2 is claimed 00:13:28.006 [2024-06-10 10:08:49.649412] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:28.006 [2024-06-10 10:08:49.649568] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1424160 00:13:28.006 [2024-06-10 10:08:49.649575] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:28.006 [2024-06-10 10:08:49.649721] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x141d6e0 00:13:28.006 [2024-06-10 10:08:49.649841] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1424160 00:13:28.006 [2024-06-10 10:08:49.649851] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1424160 00:13:28.006 [2024-06-10 10:08:49.649925] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.006 "name": "raid_bdev1", 00:13:28.006 "uuid": "71927ed1-217c-43d6-ae30-2a72a903a91b", 00:13:28.006 "strip_size_kb": 64, 00:13:28.006 "state": "online", 00:13:28.006 "raid_level": "raid0", 00:13:28.006 "superblock": true, 00:13:28.006 "num_base_bdevs": 3, 00:13:28.006 "num_base_bdevs_discovered": 3, 00:13:28.006 "num_base_bdevs_operational": 3, 00:13:28.006 "base_bdevs_list": [ 00:13:28.006 { 00:13:28.006 "name": "BaseBdev1", 00:13:28.006 "uuid": "22c979fc-a176-5119-98fc-c7df3dc12536", 00:13:28.006 "is_configured": true, 00:13:28.006 "data_offset": 2048, 00:13:28.006 "data_size": 63488 00:13:28.006 }, 00:13:28.006 { 00:13:28.006 "name": "BaseBdev2", 00:13:28.006 "uuid": "ef92a2ed-1480-5ee0-8158-83952c14ebbf", 00:13:28.006 "is_configured": true, 00:13:28.006 "data_offset": 2048, 00:13:28.006 "data_size": 63488 00:13:28.006 }, 00:13:28.006 { 00:13:28.006 "name": "BaseBdev3", 00:13:28.006 "uuid": "b0de84a1-1cc6-52c9-9bdd-e2bb58ad3850", 00:13:28.006 "is_configured": true, 00:13:28.006 "data_offset": 2048, 00:13:28.006 "data_size": 63488 00:13:28.006 } 00:13:28.006 ] 00:13:28.006 }' 
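The xtrace above is the raid_write_error_test setup: each BaseBdevN is a 32 MB malloc bdev wrapped by an error bdev (exposed as EE_BaseBdevN_malloc) and re-presented through a passthru bdev, and the three passthru bdevs are assembled into raid_bdev1 (raid0, 64 KiB strip, superblock). A condensed sketch of that flow, using only RPCs that appear in this trace; the shell variable, the loop and the pipe into jq are illustrative shorthand, not the literal bdev_raid.sh code:

  # illustrative sketch of the sequence traced above, not the test script itself
  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for bdev in BaseBdev1 BaseBdev2 BaseBdev3; do
      $rpc bdev_malloc_create 32 512 -b ${bdev}_malloc          # 32 MB backing malloc bdev
      $rpc bdev_error_create ${bdev}_malloc                     # error-injection wrapper EE_${bdev}_malloc
      $rpc bdev_passthru_create -b EE_${bdev}_malloc -p ${bdev} # plain name consumed by the raid
  done
  $rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
  # make BaseBdev1 fail writes while bdevperf runs, then confirm raid_bdev1 stays online
  $rpc bdev_error_inject_error EE_BaseBdev1_malloc write failure
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

The failure rate asserted at the end of the test (fail_per_s) is pulled from the bdevperf log the same way the trace shows: grep -v Job /raidtest/tmp.CgquuLBLOq | grep raid_bdev1 | awk '{print $6}'.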
00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.006 10:08:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.576 10:08:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:28.577 10:08:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:28.837 [2024-06-10 10:08:50.494665] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf78d40 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.779 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:29.780 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.040 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:30.040 "name": "raid_bdev1", 00:13:30.040 "uuid": "71927ed1-217c-43d6-ae30-2a72a903a91b", 00:13:30.040 "strip_size_kb": 64, 00:13:30.040 "state": "online", 00:13:30.040 "raid_level": "raid0", 00:13:30.040 "superblock": true, 00:13:30.040 "num_base_bdevs": 3, 00:13:30.040 "num_base_bdevs_discovered": 3, 00:13:30.040 "num_base_bdevs_operational": 3, 00:13:30.040 "base_bdevs_list": [ 00:13:30.040 { 00:13:30.040 "name": "BaseBdev1", 00:13:30.040 "uuid": "22c979fc-a176-5119-98fc-c7df3dc12536", 00:13:30.040 "is_configured": true, 00:13:30.040 "data_offset": 2048, 00:13:30.040 "data_size": 63488 00:13:30.040 }, 00:13:30.040 { 00:13:30.040 "name": "BaseBdev2", 00:13:30.040 "uuid": "ef92a2ed-1480-5ee0-8158-83952c14ebbf", 00:13:30.040 "is_configured": true, 00:13:30.040 "data_offset": 2048, 00:13:30.040 "data_size": 63488 
00:13:30.040 }, 00:13:30.040 { 00:13:30.040 "name": "BaseBdev3", 00:13:30.040 "uuid": "b0de84a1-1cc6-52c9-9bdd-e2bb58ad3850", 00:13:30.040 "is_configured": true, 00:13:30.040 "data_offset": 2048, 00:13:30.040 "data_size": 63488 00:13:30.040 } 00:13:30.040 ] 00:13:30.040 }' 00:13:30.040 10:08:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:30.040 10:08:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.611 10:08:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:30.872 [2024-06-10 10:08:52.506321] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:30.872 [2024-06-10 10:08:52.506348] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:30.872 [2024-06-10 10:08:52.508932] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:30.872 [2024-06-10 10:08:52.508958] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:30.872 [2024-06-10 10:08:52.508985] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:30.872 [2024-06-10 10:08:52.508991] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1424160 name raid_bdev1, state offline 00:13:30.872 0 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 991281 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 991281 ']' 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 991281 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 991281 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 991281' 00:13:30.872 killing process with pid 991281 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 991281 00:13:30.872 [2024-06-10 10:08:52.576450] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 991281 00:13:30.872 [2024-06-10 10:08:52.587538] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.CgquuLBLOq 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@213 -- # case $1 in 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:13:30.872 00:13:30.872 real 0m5.853s 00:13:30.872 user 0m9.302s 00:13:30.872 sys 0m0.808s 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:30.872 10:08:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.872 ************************************ 00:13:30.872 END TEST raid_write_error_test 00:13:30.872 ************************************ 00:13:31.134 10:08:52 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:31.134 10:08:52 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:13:31.134 10:08:52 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:13:31.134 10:08:52 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:31.134 10:08:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:31.134 ************************************ 00:13:31.134 START TEST raid_state_function_test 00:13:31.134 ************************************ 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 3 false 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=992536 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 992536' 00:13:31.134 Process raid pid: 992536 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 992536 /var/tmp/spdk-raid.sock 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 992536 ']' 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:31.134 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:31.134 10:08:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.134 [2024-06-10 10:08:52.860352] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
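This is the start-up of raid_state_function_test for concat: a bdev_svc app is launched on the same RPC socket, Existed_Raid is created before any of its base bdevs exist, so it sits in the "configuring" state with zero base bdevs discovered, and the 32 MB malloc base bdevs are then registered one at a time until the raid goes "online". A compressed sketch of that sequence, limited to RPCs visible in the trace; the loop and the inline jq check are illustrative, and the real script also deletes and re-creates Existed_Raid between some of its checks:

  # illustrative sketch of the state transitions traced below, not the test script itself
  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # the raid may be declared before its members exist: state stays "configuring",
  # num_base_bdevs_discovered == 0 while num_base_bdevs_operational == 3
  $rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  for bdev in BaseBdev1 BaseBdev2 BaseBdev3; do
      # each malloc bdev registered under an expected name is claimed by Existed_Raid and
      # bumps num_base_bdevs_discovered; after the third one the state flips to "online"
      $rpc bdev_malloc_create 32 512 -b $bdev
      $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
  done
  $rpc bdev_raid_delete Existed_Raid   # tear-down used between the test's sub-checks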
00:13:31.134 [2024-06-10 10:08:52.860397] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:31.134 [2024-06-10 10:08:52.947914] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:31.395 [2024-06-10 10:08:53.012553] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:13:31.395 [2024-06-10 10:08:53.051701] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:31.395 [2024-06-10 10:08:53.051723] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:31.968 10:08:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:31.968 10:08:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:13:31.968 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:32.229 [2024-06-10 10:08:53.866530] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:32.229 [2024-06-10 10:08:53.866560] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:32.229 [2024-06-10 10:08:53.866566] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:32.229 [2024-06-10 10:08:53.866572] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:32.229 [2024-06-10 10:08:53.866577] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:32.229 [2024-06-10 10:08:53.866582] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.229 10:08:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:32.229 10:08:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:13:32.229 "name": "Existed_Raid", 00:13:32.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.230 "strip_size_kb": 64, 00:13:32.230 "state": "configuring", 00:13:32.230 "raid_level": "concat", 00:13:32.230 "superblock": false, 00:13:32.230 "num_base_bdevs": 3, 00:13:32.230 "num_base_bdevs_discovered": 0, 00:13:32.230 "num_base_bdevs_operational": 3, 00:13:32.230 "base_bdevs_list": [ 00:13:32.230 { 00:13:32.230 "name": "BaseBdev1", 00:13:32.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.230 "is_configured": false, 00:13:32.230 "data_offset": 0, 00:13:32.230 "data_size": 0 00:13:32.230 }, 00:13:32.230 { 00:13:32.230 "name": "BaseBdev2", 00:13:32.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.230 "is_configured": false, 00:13:32.230 "data_offset": 0, 00:13:32.230 "data_size": 0 00:13:32.230 }, 00:13:32.230 { 00:13:32.230 "name": "BaseBdev3", 00:13:32.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.230 "is_configured": false, 00:13:32.230 "data_offset": 0, 00:13:32.230 "data_size": 0 00:13:32.230 } 00:13:32.230 ] 00:13:32.230 }' 00:13:32.230 10:08:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:32.230 10:08:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.802 10:08:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:33.062 [2024-06-10 10:08:54.780734] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:33.062 [2024-06-10 10:08:54.780749] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x99bb00 name Existed_Raid, state configuring 00:13:33.062 10:08:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:33.323 [2024-06-10 10:08:54.969230] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:33.323 [2024-06-10 10:08:54.969248] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:33.323 [2024-06-10 10:08:54.969254] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:33.323 [2024-06-10 10:08:54.969259] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:33.323 [2024-06-10 10:08:54.969263] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:33.323 [2024-06-10 10:08:54.969269] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:33.323 10:08:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:33.323 [2024-06-10 10:08:55.164435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:33.323 BaseBdev1 00:13:33.323 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:33.323 10:08:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:13:33.323 10:08:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:33.323 10:08:55 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:33.323 10:08:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:33.323 10:08:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:33.323 10:08:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:33.583 10:08:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:33.844 [ 00:13:33.844 { 00:13:33.844 "name": "BaseBdev1", 00:13:33.844 "aliases": [ 00:13:33.844 "54b539f1-4693-4375-b894-da2ceb0a960b" 00:13:33.844 ], 00:13:33.844 "product_name": "Malloc disk", 00:13:33.844 "block_size": 512, 00:13:33.844 "num_blocks": 65536, 00:13:33.844 "uuid": "54b539f1-4693-4375-b894-da2ceb0a960b", 00:13:33.844 "assigned_rate_limits": { 00:13:33.844 "rw_ios_per_sec": 0, 00:13:33.844 "rw_mbytes_per_sec": 0, 00:13:33.844 "r_mbytes_per_sec": 0, 00:13:33.844 "w_mbytes_per_sec": 0 00:13:33.844 }, 00:13:33.844 "claimed": true, 00:13:33.844 "claim_type": "exclusive_write", 00:13:33.844 "zoned": false, 00:13:33.844 "supported_io_types": { 00:13:33.844 "read": true, 00:13:33.844 "write": true, 00:13:33.844 "unmap": true, 00:13:33.844 "write_zeroes": true, 00:13:33.844 "flush": true, 00:13:33.844 "reset": true, 00:13:33.844 "compare": false, 00:13:33.844 "compare_and_write": false, 00:13:33.844 "abort": true, 00:13:33.844 "nvme_admin": false, 00:13:33.844 "nvme_io": false 00:13:33.844 }, 00:13:33.844 "memory_domains": [ 00:13:33.844 { 00:13:33.844 "dma_device_id": "system", 00:13:33.844 "dma_device_type": 1 00:13:33.844 }, 00:13:33.844 { 00:13:33.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.844 "dma_device_type": 2 00:13:33.844 } 00:13:33.844 ], 00:13:33.844 "driver_specific": {} 00:13:33.844 } 00:13:33.844 ] 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:13:33.844 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:34.106 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:34.106 "name": "Existed_Raid", 00:13:34.106 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:34.106 "strip_size_kb": 64, 00:13:34.106 "state": "configuring", 00:13:34.106 "raid_level": "concat", 00:13:34.106 "superblock": false, 00:13:34.106 "num_base_bdevs": 3, 00:13:34.106 "num_base_bdevs_discovered": 1, 00:13:34.106 "num_base_bdevs_operational": 3, 00:13:34.106 "base_bdevs_list": [ 00:13:34.106 { 00:13:34.106 "name": "BaseBdev1", 00:13:34.106 "uuid": "54b539f1-4693-4375-b894-da2ceb0a960b", 00:13:34.106 "is_configured": true, 00:13:34.106 "data_offset": 0, 00:13:34.106 "data_size": 65536 00:13:34.106 }, 00:13:34.106 { 00:13:34.106 "name": "BaseBdev2", 00:13:34.106 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:34.106 "is_configured": false, 00:13:34.106 "data_offset": 0, 00:13:34.106 "data_size": 0 00:13:34.106 }, 00:13:34.106 { 00:13:34.106 "name": "BaseBdev3", 00:13:34.106 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:34.106 "is_configured": false, 00:13:34.106 "data_offset": 0, 00:13:34.106 "data_size": 0 00:13:34.106 } 00:13:34.106 ] 00:13:34.106 }' 00:13:34.106 10:08:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:34.106 10:08:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:34.678 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:34.678 [2024-06-10 10:08:56.451673] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:34.678 [2024-06-10 10:08:56.451699] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x99b3f0 name Existed_Raid, state configuring 00:13:34.678 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:34.939 [2024-06-10 10:08:56.640177] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:34.939 [2024-06-10 10:08:56.641309] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:34.939 [2024-06-10 10:08:56.641332] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:34.939 [2024-06-10 10:08:56.641338] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:34.939 [2024-06-10 10:08:56.641343] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:34.939 10:08:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.939 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:35.200 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.200 "name": "Existed_Raid", 00:13:35.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.200 "strip_size_kb": 64, 00:13:35.200 "state": "configuring", 00:13:35.200 "raid_level": "concat", 00:13:35.200 "superblock": false, 00:13:35.200 "num_base_bdevs": 3, 00:13:35.200 "num_base_bdevs_discovered": 1, 00:13:35.200 "num_base_bdevs_operational": 3, 00:13:35.200 "base_bdevs_list": [ 00:13:35.200 { 00:13:35.200 "name": "BaseBdev1", 00:13:35.200 "uuid": "54b539f1-4693-4375-b894-da2ceb0a960b", 00:13:35.200 "is_configured": true, 00:13:35.200 "data_offset": 0, 00:13:35.200 "data_size": 65536 00:13:35.200 }, 00:13:35.200 { 00:13:35.200 "name": "BaseBdev2", 00:13:35.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.200 "is_configured": false, 00:13:35.200 "data_offset": 0, 00:13:35.200 "data_size": 0 00:13:35.200 }, 00:13:35.200 { 00:13:35.200 "name": "BaseBdev3", 00:13:35.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.200 "is_configured": false, 00:13:35.200 "data_offset": 0, 00:13:35.200 "data_size": 0 00:13:35.200 } 00:13:35.200 ] 00:13:35.200 }' 00:13:35.200 10:08:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.200 10:08:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.770 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:35.770 [2024-06-10 10:08:57.527362] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:35.770 BaseBdev2 00:13:35.770 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:35.770 10:08:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:13:35.770 10:08:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:35.770 10:08:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:35.770 10:08:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:35.770 10:08:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:35.770 10:08:57 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:36.030 [ 00:13:36.030 { 00:13:36.030 "name": "BaseBdev2", 00:13:36.030 "aliases": [ 00:13:36.030 "193696c4-d5c9-4a42-aa31-df72bbf77680" 00:13:36.030 ], 00:13:36.030 "product_name": "Malloc disk", 00:13:36.030 "block_size": 512, 00:13:36.030 "num_blocks": 65536, 00:13:36.030 "uuid": "193696c4-d5c9-4a42-aa31-df72bbf77680", 00:13:36.030 "assigned_rate_limits": { 00:13:36.030 "rw_ios_per_sec": 0, 00:13:36.030 "rw_mbytes_per_sec": 0, 00:13:36.030 "r_mbytes_per_sec": 0, 00:13:36.030 "w_mbytes_per_sec": 0 00:13:36.030 }, 00:13:36.030 "claimed": true, 00:13:36.030 "claim_type": "exclusive_write", 00:13:36.030 "zoned": false, 00:13:36.030 "supported_io_types": { 00:13:36.030 "read": true, 00:13:36.030 "write": true, 00:13:36.030 "unmap": true, 00:13:36.030 "write_zeroes": true, 00:13:36.030 "flush": true, 00:13:36.030 "reset": true, 00:13:36.030 "compare": false, 00:13:36.030 "compare_and_write": false, 00:13:36.030 "abort": true, 00:13:36.030 "nvme_admin": false, 00:13:36.030 "nvme_io": false 00:13:36.030 }, 00:13:36.030 "memory_domains": [ 00:13:36.030 { 00:13:36.030 "dma_device_id": "system", 00:13:36.030 "dma_device_type": 1 00:13:36.030 }, 00:13:36.030 { 00:13:36.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.030 "dma_device_type": 2 00:13:36.030 } 00:13:36.030 ], 00:13:36.030 "driver_specific": {} 00:13:36.030 } 00:13:36.030 ] 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.030 10:08:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:13:36.290 10:08:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:36.290 "name": "Existed_Raid", 00:13:36.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.290 "strip_size_kb": 64, 00:13:36.290 "state": "configuring", 00:13:36.290 "raid_level": "concat", 00:13:36.290 "superblock": false, 00:13:36.290 "num_base_bdevs": 3, 00:13:36.290 "num_base_bdevs_discovered": 2, 00:13:36.290 "num_base_bdevs_operational": 3, 00:13:36.290 "base_bdevs_list": [ 00:13:36.290 { 00:13:36.290 "name": "BaseBdev1", 00:13:36.290 "uuid": "54b539f1-4693-4375-b894-da2ceb0a960b", 00:13:36.290 "is_configured": true, 00:13:36.290 "data_offset": 0, 00:13:36.290 "data_size": 65536 00:13:36.290 }, 00:13:36.290 { 00:13:36.290 "name": "BaseBdev2", 00:13:36.290 "uuid": "193696c4-d5c9-4a42-aa31-df72bbf77680", 00:13:36.290 "is_configured": true, 00:13:36.290 "data_offset": 0, 00:13:36.290 "data_size": 65536 00:13:36.290 }, 00:13:36.290 { 00:13:36.290 "name": "BaseBdev3", 00:13:36.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.290 "is_configured": false, 00:13:36.290 "data_offset": 0, 00:13:36.290 "data_size": 0 00:13:36.290 } 00:13:36.290 ] 00:13:36.290 }' 00:13:36.290 10:08:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:36.290 10:08:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:36.860 10:08:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:37.120 [2024-06-10 10:08:58.759306] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:37.120 [2024-06-10 10:08:58.759330] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x99c2c0 00:13:37.120 [2024-06-10 10:08:58.759334] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:37.120 [2024-06-10 10:08:58.759475] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb3f970 00:13:37.120 [2024-06-10 10:08:58.759565] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x99c2c0 00:13:37.120 [2024-06-10 10:08:58.759570] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x99c2c0 00:13:37.120 [2024-06-10 10:08:58.759686] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:37.120 BaseBdev3 00:13:37.120 10:08:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:37.120 10:08:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:13:37.120 10:08:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:37.120 10:08:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:37.120 10:08:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:37.120 10:08:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:37.120 10:08:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:37.120 10:08:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:37.380 [ 00:13:37.380 { 00:13:37.380 "name": "BaseBdev3", 00:13:37.380 "aliases": [ 00:13:37.380 "f6d76282-8fea-4cf2-81ca-08f0583a000e" 00:13:37.380 ], 00:13:37.380 "product_name": "Malloc disk", 00:13:37.380 "block_size": 512, 00:13:37.380 "num_blocks": 65536, 00:13:37.380 "uuid": "f6d76282-8fea-4cf2-81ca-08f0583a000e", 00:13:37.380 "assigned_rate_limits": { 00:13:37.380 "rw_ios_per_sec": 0, 00:13:37.380 "rw_mbytes_per_sec": 0, 00:13:37.380 "r_mbytes_per_sec": 0, 00:13:37.380 "w_mbytes_per_sec": 0 00:13:37.380 }, 00:13:37.380 "claimed": true, 00:13:37.380 "claim_type": "exclusive_write", 00:13:37.380 "zoned": false, 00:13:37.380 "supported_io_types": { 00:13:37.380 "read": true, 00:13:37.380 "write": true, 00:13:37.380 "unmap": true, 00:13:37.380 "write_zeroes": true, 00:13:37.380 "flush": true, 00:13:37.380 "reset": true, 00:13:37.380 "compare": false, 00:13:37.380 "compare_and_write": false, 00:13:37.380 "abort": true, 00:13:37.380 "nvme_admin": false, 00:13:37.380 "nvme_io": false 00:13:37.380 }, 00:13:37.380 "memory_domains": [ 00:13:37.380 { 00:13:37.380 "dma_device_id": "system", 00:13:37.380 "dma_device_type": 1 00:13:37.380 }, 00:13:37.380 { 00:13:37.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.380 "dma_device_type": 2 00:13:37.380 } 00:13:37.380 ], 00:13:37.380 "driver_specific": {} 00:13:37.380 } 00:13:37.380 ] 00:13:37.380 10:08:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:37.380 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:37.380 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:37.380 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:37.380 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:37.381 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:37.381 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:37.381 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:37.381 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:37.381 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.381 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.381 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:37.381 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.381 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:37.381 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.641 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:37.641 "name": "Existed_Raid", 00:13:37.641 "uuid": "721a9f1b-5826-43a6-a69a-7a9304735c77", 00:13:37.641 "strip_size_kb": 64, 00:13:37.641 "state": "online", 
00:13:37.641 "raid_level": "concat", 00:13:37.641 "superblock": false, 00:13:37.641 "num_base_bdevs": 3, 00:13:37.641 "num_base_bdevs_discovered": 3, 00:13:37.641 "num_base_bdevs_operational": 3, 00:13:37.641 "base_bdevs_list": [ 00:13:37.641 { 00:13:37.641 "name": "BaseBdev1", 00:13:37.641 "uuid": "54b539f1-4693-4375-b894-da2ceb0a960b", 00:13:37.641 "is_configured": true, 00:13:37.641 "data_offset": 0, 00:13:37.641 "data_size": 65536 00:13:37.641 }, 00:13:37.641 { 00:13:37.641 "name": "BaseBdev2", 00:13:37.641 "uuid": "193696c4-d5c9-4a42-aa31-df72bbf77680", 00:13:37.641 "is_configured": true, 00:13:37.641 "data_offset": 0, 00:13:37.641 "data_size": 65536 00:13:37.641 }, 00:13:37.641 { 00:13:37.641 "name": "BaseBdev3", 00:13:37.641 "uuid": "f6d76282-8fea-4cf2-81ca-08f0583a000e", 00:13:37.641 "is_configured": true, 00:13:37.641 "data_offset": 0, 00:13:37.641 "data_size": 65536 00:13:37.641 } 00:13:37.641 ] 00:13:37.641 }' 00:13:37.641 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:37.641 10:08:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.211 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:38.211 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:38.211 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:38.211 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:38.211 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:38.211 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:38.211 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:38.211 10:08:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:38.211 [2024-06-10 10:09:00.054801] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:38.211 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:38.211 "name": "Existed_Raid", 00:13:38.211 "aliases": [ 00:13:38.211 "721a9f1b-5826-43a6-a69a-7a9304735c77" 00:13:38.211 ], 00:13:38.211 "product_name": "Raid Volume", 00:13:38.211 "block_size": 512, 00:13:38.211 "num_blocks": 196608, 00:13:38.211 "uuid": "721a9f1b-5826-43a6-a69a-7a9304735c77", 00:13:38.211 "assigned_rate_limits": { 00:13:38.211 "rw_ios_per_sec": 0, 00:13:38.211 "rw_mbytes_per_sec": 0, 00:13:38.211 "r_mbytes_per_sec": 0, 00:13:38.211 "w_mbytes_per_sec": 0 00:13:38.211 }, 00:13:38.211 "claimed": false, 00:13:38.211 "zoned": false, 00:13:38.211 "supported_io_types": { 00:13:38.211 "read": true, 00:13:38.212 "write": true, 00:13:38.212 "unmap": true, 00:13:38.212 "write_zeroes": true, 00:13:38.212 "flush": true, 00:13:38.212 "reset": true, 00:13:38.212 "compare": false, 00:13:38.212 "compare_and_write": false, 00:13:38.212 "abort": false, 00:13:38.212 "nvme_admin": false, 00:13:38.212 "nvme_io": false 00:13:38.212 }, 00:13:38.212 "memory_domains": [ 00:13:38.212 { 00:13:38.212 "dma_device_id": "system", 00:13:38.212 "dma_device_type": 1 00:13:38.212 }, 00:13:38.212 { 00:13:38.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.212 "dma_device_type": 2 00:13:38.212 }, 
00:13:38.212 { 00:13:38.212 "dma_device_id": "system", 00:13:38.212 "dma_device_type": 1 00:13:38.212 }, 00:13:38.212 { 00:13:38.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.212 "dma_device_type": 2 00:13:38.212 }, 00:13:38.212 { 00:13:38.212 "dma_device_id": "system", 00:13:38.212 "dma_device_type": 1 00:13:38.212 }, 00:13:38.212 { 00:13:38.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.212 "dma_device_type": 2 00:13:38.212 } 00:13:38.212 ], 00:13:38.212 "driver_specific": { 00:13:38.212 "raid": { 00:13:38.212 "uuid": "721a9f1b-5826-43a6-a69a-7a9304735c77", 00:13:38.212 "strip_size_kb": 64, 00:13:38.212 "state": "online", 00:13:38.212 "raid_level": "concat", 00:13:38.212 "superblock": false, 00:13:38.212 "num_base_bdevs": 3, 00:13:38.212 "num_base_bdevs_discovered": 3, 00:13:38.212 "num_base_bdevs_operational": 3, 00:13:38.212 "base_bdevs_list": [ 00:13:38.212 { 00:13:38.212 "name": "BaseBdev1", 00:13:38.212 "uuid": "54b539f1-4693-4375-b894-da2ceb0a960b", 00:13:38.212 "is_configured": true, 00:13:38.212 "data_offset": 0, 00:13:38.212 "data_size": 65536 00:13:38.212 }, 00:13:38.212 { 00:13:38.212 "name": "BaseBdev2", 00:13:38.212 "uuid": "193696c4-d5c9-4a42-aa31-df72bbf77680", 00:13:38.212 "is_configured": true, 00:13:38.212 "data_offset": 0, 00:13:38.212 "data_size": 65536 00:13:38.212 }, 00:13:38.212 { 00:13:38.212 "name": "BaseBdev3", 00:13:38.212 "uuid": "f6d76282-8fea-4cf2-81ca-08f0583a000e", 00:13:38.212 "is_configured": true, 00:13:38.212 "data_offset": 0, 00:13:38.212 "data_size": 65536 00:13:38.212 } 00:13:38.212 ] 00:13:38.212 } 00:13:38.212 } 00:13:38.212 }' 00:13:38.212 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:38.471 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:38.471 BaseBdev2 00:13:38.471 BaseBdev3' 00:13:38.471 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:38.471 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:38.471 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:38.471 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:38.471 "name": "BaseBdev1", 00:13:38.471 "aliases": [ 00:13:38.471 "54b539f1-4693-4375-b894-da2ceb0a960b" 00:13:38.471 ], 00:13:38.471 "product_name": "Malloc disk", 00:13:38.471 "block_size": 512, 00:13:38.471 "num_blocks": 65536, 00:13:38.471 "uuid": "54b539f1-4693-4375-b894-da2ceb0a960b", 00:13:38.471 "assigned_rate_limits": { 00:13:38.471 "rw_ios_per_sec": 0, 00:13:38.471 "rw_mbytes_per_sec": 0, 00:13:38.471 "r_mbytes_per_sec": 0, 00:13:38.471 "w_mbytes_per_sec": 0 00:13:38.471 }, 00:13:38.471 "claimed": true, 00:13:38.471 "claim_type": "exclusive_write", 00:13:38.471 "zoned": false, 00:13:38.471 "supported_io_types": { 00:13:38.471 "read": true, 00:13:38.471 "write": true, 00:13:38.471 "unmap": true, 00:13:38.471 "write_zeroes": true, 00:13:38.471 "flush": true, 00:13:38.471 "reset": true, 00:13:38.471 "compare": false, 00:13:38.471 "compare_and_write": false, 00:13:38.471 "abort": true, 00:13:38.471 "nvme_admin": false, 00:13:38.471 "nvme_io": false 00:13:38.471 }, 00:13:38.471 "memory_domains": [ 00:13:38.471 { 00:13:38.471 "dma_device_id": "system", 
00:13:38.471 "dma_device_type": 1 00:13:38.471 }, 00:13:38.471 { 00:13:38.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.471 "dma_device_type": 2 00:13:38.471 } 00:13:38.471 ], 00:13:38.471 "driver_specific": {} 00:13:38.471 }' 00:13:38.471 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.731 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.731 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:38.731 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.731 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.731 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:38.731 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.731 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.731 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:38.731 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:38.991 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:38.991 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:38.991 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:38.991 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:38.991 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:38.991 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:38.991 "name": "BaseBdev2", 00:13:38.991 "aliases": [ 00:13:38.991 "193696c4-d5c9-4a42-aa31-df72bbf77680" 00:13:38.991 ], 00:13:38.991 "product_name": "Malloc disk", 00:13:38.991 "block_size": 512, 00:13:38.991 "num_blocks": 65536, 00:13:38.991 "uuid": "193696c4-d5c9-4a42-aa31-df72bbf77680", 00:13:38.991 "assigned_rate_limits": { 00:13:38.991 "rw_ios_per_sec": 0, 00:13:38.991 "rw_mbytes_per_sec": 0, 00:13:38.991 "r_mbytes_per_sec": 0, 00:13:38.991 "w_mbytes_per_sec": 0 00:13:38.991 }, 00:13:38.991 "claimed": true, 00:13:38.991 "claim_type": "exclusive_write", 00:13:38.991 "zoned": false, 00:13:38.991 "supported_io_types": { 00:13:38.991 "read": true, 00:13:38.991 "write": true, 00:13:38.991 "unmap": true, 00:13:38.991 "write_zeroes": true, 00:13:38.991 "flush": true, 00:13:38.991 "reset": true, 00:13:38.991 "compare": false, 00:13:38.991 "compare_and_write": false, 00:13:38.991 "abort": true, 00:13:38.991 "nvme_admin": false, 00:13:38.991 "nvme_io": false 00:13:38.991 }, 00:13:38.991 "memory_domains": [ 00:13:38.991 { 00:13:38.991 "dma_device_id": "system", 00:13:38.991 "dma_device_type": 1 00:13:38.991 }, 00:13:38.991 { 00:13:38.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.991 "dma_device_type": 2 00:13:38.991 } 00:13:38.991 ], 00:13:38.991 "driver_specific": {} 00:13:38.991 }' 00:13:38.992 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:39.295 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:39.295 10:09:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:39.295 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:39.295 10:09:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:39.295 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:39.295 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:39.295 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:39.295 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:39.295 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.295 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.610 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:39.610 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:39.610 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:39.610 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:39.610 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:39.610 "name": "BaseBdev3", 00:13:39.610 "aliases": [ 00:13:39.610 "f6d76282-8fea-4cf2-81ca-08f0583a000e" 00:13:39.610 ], 00:13:39.610 "product_name": "Malloc disk", 00:13:39.610 "block_size": 512, 00:13:39.610 "num_blocks": 65536, 00:13:39.610 "uuid": "f6d76282-8fea-4cf2-81ca-08f0583a000e", 00:13:39.610 "assigned_rate_limits": { 00:13:39.610 "rw_ios_per_sec": 0, 00:13:39.610 "rw_mbytes_per_sec": 0, 00:13:39.610 "r_mbytes_per_sec": 0, 00:13:39.610 "w_mbytes_per_sec": 0 00:13:39.610 }, 00:13:39.610 "claimed": true, 00:13:39.610 "claim_type": "exclusive_write", 00:13:39.610 "zoned": false, 00:13:39.610 "supported_io_types": { 00:13:39.610 "read": true, 00:13:39.610 "write": true, 00:13:39.610 "unmap": true, 00:13:39.610 "write_zeroes": true, 00:13:39.610 "flush": true, 00:13:39.610 "reset": true, 00:13:39.610 "compare": false, 00:13:39.610 "compare_and_write": false, 00:13:39.610 "abort": true, 00:13:39.610 "nvme_admin": false, 00:13:39.610 "nvme_io": false 00:13:39.610 }, 00:13:39.610 "memory_domains": [ 00:13:39.610 { 00:13:39.610 "dma_device_id": "system", 00:13:39.610 "dma_device_type": 1 00:13:39.610 }, 00:13:39.610 { 00:13:39.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:39.610 "dma_device_type": 2 00:13:39.610 } 00:13:39.610 ], 00:13:39.610 "driver_specific": {} 00:13:39.610 }' 00:13:39.610 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:39.610 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:39.870 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:39.870 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:39.870 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:39.870 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:39.870 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:13:39.870 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:39.870 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:39.870 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.870 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.870 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:39.870 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:40.130 [2024-06-10 10:09:01.907345] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:40.130 [2024-06-10 10:09:01.907362] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:40.130 [2024-06-10 10:09:01.907392] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.130 10:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:40.390 10:09:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.390 "name": "Existed_Raid", 00:13:40.390 "uuid": "721a9f1b-5826-43a6-a69a-7a9304735c77", 00:13:40.390 "strip_size_kb": 64, 00:13:40.390 "state": "offline", 00:13:40.390 "raid_level": "concat", 00:13:40.390 "superblock": false, 00:13:40.390 "num_base_bdevs": 3, 00:13:40.390 "num_base_bdevs_discovered": 2, 00:13:40.390 
"num_base_bdevs_operational": 2, 00:13:40.390 "base_bdevs_list": [ 00:13:40.390 { 00:13:40.390 "name": null, 00:13:40.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.391 "is_configured": false, 00:13:40.391 "data_offset": 0, 00:13:40.391 "data_size": 65536 00:13:40.391 }, 00:13:40.391 { 00:13:40.391 "name": "BaseBdev2", 00:13:40.391 "uuid": "193696c4-d5c9-4a42-aa31-df72bbf77680", 00:13:40.391 "is_configured": true, 00:13:40.391 "data_offset": 0, 00:13:40.391 "data_size": 65536 00:13:40.391 }, 00:13:40.391 { 00:13:40.391 "name": "BaseBdev3", 00:13:40.391 "uuid": "f6d76282-8fea-4cf2-81ca-08f0583a000e", 00:13:40.391 "is_configured": true, 00:13:40.391 "data_offset": 0, 00:13:40.391 "data_size": 65536 00:13:40.391 } 00:13:40.391 ] 00:13:40.391 }' 00:13:40.391 10:09:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.391 10:09:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:40.960 10:09:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:40.960 10:09:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:40.960 10:09:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.960 10:09:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:41.221 10:09:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:41.221 10:09:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:41.221 10:09:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:41.221 [2024-06-10 10:09:03.018161] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:41.221 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:41.221 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:41.221 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.221 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:41.482 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:41.482 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:41.482 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:41.743 [2024-06-10 10:09:03.420929] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:41.743 [2024-06-10 10:09:03.420956] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x99c2c0 name Existed_Raid, state offline 00:13:41.743 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:41.743 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:41.743 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.743 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:42.004 BaseBdev2 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:42.004 10:09:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:42.265 10:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:42.525 [ 00:13:42.525 { 00:13:42.525 "name": "BaseBdev2", 00:13:42.525 "aliases": [ 00:13:42.525 "4d01f1d2-3afa-42ca-8236-6dc5030c6a61" 00:13:42.525 ], 00:13:42.525 "product_name": "Malloc disk", 00:13:42.525 "block_size": 512, 00:13:42.525 "num_blocks": 65536, 00:13:42.525 "uuid": "4d01f1d2-3afa-42ca-8236-6dc5030c6a61", 00:13:42.525 "assigned_rate_limits": { 00:13:42.525 "rw_ios_per_sec": 0, 00:13:42.525 "rw_mbytes_per_sec": 0, 00:13:42.525 "r_mbytes_per_sec": 0, 00:13:42.525 "w_mbytes_per_sec": 0 00:13:42.525 }, 00:13:42.525 "claimed": false, 00:13:42.525 "zoned": false, 00:13:42.525 "supported_io_types": { 00:13:42.525 "read": true, 00:13:42.525 "write": true, 00:13:42.525 "unmap": true, 00:13:42.525 "write_zeroes": true, 00:13:42.525 "flush": true, 00:13:42.525 "reset": true, 00:13:42.525 "compare": false, 00:13:42.525 "compare_and_write": false, 00:13:42.525 "abort": true, 00:13:42.525 "nvme_admin": false, 00:13:42.525 "nvme_io": false 00:13:42.525 }, 00:13:42.525 "memory_domains": [ 00:13:42.525 { 00:13:42.525 "dma_device_id": "system", 00:13:42.525 "dma_device_type": 1 00:13:42.525 }, 00:13:42.525 { 00:13:42.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.525 "dma_device_type": 2 00:13:42.525 } 00:13:42.525 ], 00:13:42.525 "driver_specific": {} 00:13:42.525 } 00:13:42.525 ] 00:13:42.525 10:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:42.525 10:09:04 
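# At this point the first Existed_Raid instance has been torn down: concat carries no
# redundancy (has_redundancy returned 1), so deleting BaseBdev1 drove the array from
# online to offline, and deleting BaseBdev2/BaseBdev3 triggered raid_bdev_cleanup. The
# loop running here rebuilds unclaimed malloc bdevs for the next scenario, roughly:
#   for name in BaseBdev2 BaseBdev3; do
#     rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b "$name"
#     rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
#     rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b "$name" -t 2000
#   done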
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:42.525 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:42.525 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:42.525 BaseBdev3 00:13:42.525 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:42.525 10:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:13:42.525 10:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:42.525 10:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:42.525 10:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:42.525 10:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:42.525 10:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:42.786 10:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:43.048 [ 00:13:43.048 { 00:13:43.048 "name": "BaseBdev3", 00:13:43.048 "aliases": [ 00:13:43.048 "91e83292-78fe-4065-8390-a4eccd68c1c7" 00:13:43.048 ], 00:13:43.048 "product_name": "Malloc disk", 00:13:43.048 "block_size": 512, 00:13:43.048 "num_blocks": 65536, 00:13:43.048 "uuid": "91e83292-78fe-4065-8390-a4eccd68c1c7", 00:13:43.048 "assigned_rate_limits": { 00:13:43.048 "rw_ios_per_sec": 0, 00:13:43.048 "rw_mbytes_per_sec": 0, 00:13:43.048 "r_mbytes_per_sec": 0, 00:13:43.048 "w_mbytes_per_sec": 0 00:13:43.048 }, 00:13:43.048 "claimed": false, 00:13:43.048 "zoned": false, 00:13:43.048 "supported_io_types": { 00:13:43.048 "read": true, 00:13:43.048 "write": true, 00:13:43.048 "unmap": true, 00:13:43.048 "write_zeroes": true, 00:13:43.048 "flush": true, 00:13:43.048 "reset": true, 00:13:43.048 "compare": false, 00:13:43.048 "compare_and_write": false, 00:13:43.048 "abort": true, 00:13:43.048 "nvme_admin": false, 00:13:43.048 "nvme_io": false 00:13:43.048 }, 00:13:43.048 "memory_domains": [ 00:13:43.048 { 00:13:43.048 "dma_device_id": "system", 00:13:43.048 "dma_device_type": 1 00:13:43.048 }, 00:13:43.048 { 00:13:43.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.048 "dma_device_type": 2 00:13:43.048 } 00:13:43.048 ], 00:13:43.048 "driver_specific": {} 00:13:43.048 } 00:13:43.048 ] 00:13:43.048 10:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:43.048 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:43.048 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:43.048 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:43.309 [2024-06-10 10:09:04.928343] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:43.309 
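# bdev_raid_create can be issued before every member exists: BaseBdev1 is still missing,
# so the NOTICE above is expected and Existed_Raid is registered in the "configuring"
# state with 2 of 3 base bdevs discovered. The create/verify pair, roughly (the trailing
# "| .state" jq step is an illustrative shorthand for the full field checks in the test):
#   rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat \
#          -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
#   rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | \
#          jq -r '.[] | select(.name == "Existed_Raid") | .state'   # -> configuring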
[2024-06-10 10:09:04.928376] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:43.309 [2024-06-10 10:09:04.928388] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:43.309 [2024-06-10 10:09:04.929409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.309 10:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:43.309 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:43.309 "name": "Existed_Raid", 00:13:43.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.309 "strip_size_kb": 64, 00:13:43.309 "state": "configuring", 00:13:43.309 "raid_level": "concat", 00:13:43.309 "superblock": false, 00:13:43.309 "num_base_bdevs": 3, 00:13:43.309 "num_base_bdevs_discovered": 2, 00:13:43.309 "num_base_bdevs_operational": 3, 00:13:43.309 "base_bdevs_list": [ 00:13:43.309 { 00:13:43.309 "name": "BaseBdev1", 00:13:43.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.309 "is_configured": false, 00:13:43.309 "data_offset": 0, 00:13:43.309 "data_size": 0 00:13:43.309 }, 00:13:43.309 { 00:13:43.309 "name": "BaseBdev2", 00:13:43.309 "uuid": "4d01f1d2-3afa-42ca-8236-6dc5030c6a61", 00:13:43.309 "is_configured": true, 00:13:43.309 "data_offset": 0, 00:13:43.309 "data_size": 65536 00:13:43.309 }, 00:13:43.309 { 00:13:43.309 "name": "BaseBdev3", 00:13:43.309 "uuid": "91e83292-78fe-4065-8390-a4eccd68c1c7", 00:13:43.309 "is_configured": true, 00:13:43.309 "data_offset": 0, 00:13:43.309 "data_size": 65536 00:13:43.309 } 00:13:43.309 ] 00:13:43.309 }' 00:13:43.309 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:43.309 10:09:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.880 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:44.140 [2024-06-10 10:09:05.814566] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:44.140 10:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.400 10:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:44.400 "name": "Existed_Raid", 00:13:44.400 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.401 "strip_size_kb": 64, 00:13:44.401 "state": "configuring", 00:13:44.401 "raid_level": "concat", 00:13:44.401 "superblock": false, 00:13:44.401 "num_base_bdevs": 3, 00:13:44.401 "num_base_bdevs_discovered": 1, 00:13:44.401 "num_base_bdevs_operational": 3, 00:13:44.401 "base_bdevs_list": [ 00:13:44.401 { 00:13:44.401 "name": "BaseBdev1", 00:13:44.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.401 "is_configured": false, 00:13:44.401 "data_offset": 0, 00:13:44.401 "data_size": 0 00:13:44.401 }, 00:13:44.401 { 00:13:44.401 "name": null, 00:13:44.401 "uuid": "4d01f1d2-3afa-42ca-8236-6dc5030c6a61", 00:13:44.401 "is_configured": false, 00:13:44.401 "data_offset": 0, 00:13:44.401 "data_size": 65536 00:13:44.401 }, 00:13:44.401 { 00:13:44.401 "name": "BaseBdev3", 00:13:44.401 "uuid": "91e83292-78fe-4065-8390-a4eccd68c1c7", 00:13:44.401 "is_configured": true, 00:13:44.401 "data_offset": 0, 00:13:44.401 "data_size": 65536 00:13:44.401 } 00:13:44.401 ] 00:13:44.401 }' 00:13:44.401 10:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:44.401 10:09:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.972 10:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.972 10:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:44.972 10:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:44.972 10:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:45.231 [2024-06-10 10:09:06.910310] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:45.231 BaseBdev1 00:13:45.231 10:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:45.231 10:09:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:13:45.231 10:09:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:45.231 10:09:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:45.231 10:09:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:45.231 10:09:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:45.232 10:09:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:45.232 10:09:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:45.492 [ 00:13:45.492 { 00:13:45.492 "name": "BaseBdev1", 00:13:45.492 "aliases": [ 00:13:45.492 "2d2fb6c0-5122-46ad-af99-b21d0731dc2b" 00:13:45.492 ], 00:13:45.492 "product_name": "Malloc disk", 00:13:45.492 "block_size": 512, 00:13:45.492 "num_blocks": 65536, 00:13:45.492 "uuid": "2d2fb6c0-5122-46ad-af99-b21d0731dc2b", 00:13:45.492 "assigned_rate_limits": { 00:13:45.492 "rw_ios_per_sec": 0, 00:13:45.492 "rw_mbytes_per_sec": 0, 00:13:45.492 "r_mbytes_per_sec": 0, 00:13:45.492 "w_mbytes_per_sec": 0 00:13:45.492 }, 00:13:45.492 "claimed": true, 00:13:45.492 "claim_type": "exclusive_write", 00:13:45.492 "zoned": false, 00:13:45.492 "supported_io_types": { 00:13:45.492 "read": true, 00:13:45.492 "write": true, 00:13:45.492 "unmap": true, 00:13:45.492 "write_zeroes": true, 00:13:45.492 "flush": true, 00:13:45.492 "reset": true, 00:13:45.492 "compare": false, 00:13:45.492 "compare_and_write": false, 00:13:45.492 "abort": true, 00:13:45.492 "nvme_admin": false, 00:13:45.492 "nvme_io": false 00:13:45.492 }, 00:13:45.492 "memory_domains": [ 00:13:45.492 { 00:13:45.492 "dma_device_id": "system", 00:13:45.492 "dma_device_type": 1 00:13:45.492 }, 00:13:45.492 { 00:13:45.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.492 "dma_device_type": 2 00:13:45.492 } 00:13:45.492 ], 00:13:45.492 "driver_specific": {} 00:13:45.492 } 00:13:45.492 ] 00:13:45.492 10:09:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.493 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.754 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.754 "name": "Existed_Raid", 00:13:45.754 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.754 "strip_size_kb": 64, 00:13:45.754 "state": "configuring", 00:13:45.754 "raid_level": "concat", 00:13:45.754 "superblock": false, 00:13:45.754 "num_base_bdevs": 3, 00:13:45.754 "num_base_bdevs_discovered": 2, 00:13:45.754 "num_base_bdevs_operational": 3, 00:13:45.754 "base_bdevs_list": [ 00:13:45.754 { 00:13:45.754 "name": "BaseBdev1", 00:13:45.754 "uuid": "2d2fb6c0-5122-46ad-af99-b21d0731dc2b", 00:13:45.754 "is_configured": true, 00:13:45.754 "data_offset": 0, 00:13:45.754 "data_size": 65536 00:13:45.754 }, 00:13:45.754 { 00:13:45.754 "name": null, 00:13:45.754 "uuid": "4d01f1d2-3afa-42ca-8236-6dc5030c6a61", 00:13:45.754 "is_configured": false, 00:13:45.754 "data_offset": 0, 00:13:45.754 "data_size": 65536 00:13:45.754 }, 00:13:45.754 { 00:13:45.754 "name": "BaseBdev3", 00:13:45.754 "uuid": "91e83292-78fe-4065-8390-a4eccd68c1c7", 00:13:45.754 "is_configured": true, 00:13:45.754 "data_offset": 0, 00:13:45.754 "data_size": 65536 00:13:45.754 } 00:13:45.754 ] 00:13:45.754 }' 00:13:45.754 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.754 10:09:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:46.326 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.326 10:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:46.326 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:46.326 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:46.586 [2024-06-10 10:09:08.353990] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.586 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:46.846 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.846 "name": "Existed_Raid", 00:13:46.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.846 "strip_size_kb": 64, 00:13:46.846 "state": "configuring", 00:13:46.846 "raid_level": "concat", 00:13:46.846 "superblock": false, 00:13:46.846 "num_base_bdevs": 3, 00:13:46.846 "num_base_bdevs_discovered": 1, 00:13:46.846 "num_base_bdevs_operational": 3, 00:13:46.846 "base_bdevs_list": [ 00:13:46.846 { 00:13:46.846 "name": "BaseBdev1", 00:13:46.846 "uuid": "2d2fb6c0-5122-46ad-af99-b21d0731dc2b", 00:13:46.846 "is_configured": true, 00:13:46.846 "data_offset": 0, 00:13:46.846 "data_size": 65536 00:13:46.846 }, 00:13:46.846 { 00:13:46.846 "name": null, 00:13:46.846 "uuid": "4d01f1d2-3afa-42ca-8236-6dc5030c6a61", 00:13:46.846 "is_configured": false, 00:13:46.846 "data_offset": 0, 00:13:46.846 "data_size": 65536 00:13:46.846 }, 00:13:46.846 { 00:13:46.846 "name": null, 00:13:46.846 "uuid": "91e83292-78fe-4065-8390-a4eccd68c1c7", 00:13:46.846 "is_configured": false, 00:13:46.846 "data_offset": 0, 00:13:46.846 "data_size": 65536 00:13:46.846 } 00:13:46.846 ] 00:13:46.846 }' 00:13:46.846 10:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.846 10:09:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:47.416 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:47.416 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.416 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:47.416 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:47.676 [2024-06-10 10:09:09.372583] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.676 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.936 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.936 "name": "Existed_Raid", 00:13:47.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.936 "strip_size_kb": 64, 00:13:47.936 "state": "configuring", 00:13:47.936 "raid_level": "concat", 00:13:47.936 "superblock": false, 00:13:47.936 "num_base_bdevs": 3, 00:13:47.936 "num_base_bdevs_discovered": 2, 00:13:47.936 "num_base_bdevs_operational": 3, 00:13:47.936 "base_bdevs_list": [ 00:13:47.936 { 00:13:47.936 "name": "BaseBdev1", 00:13:47.936 "uuid": "2d2fb6c0-5122-46ad-af99-b21d0731dc2b", 00:13:47.936 "is_configured": true, 00:13:47.936 "data_offset": 0, 00:13:47.936 "data_size": 65536 00:13:47.936 }, 00:13:47.936 { 00:13:47.936 "name": null, 00:13:47.936 "uuid": "4d01f1d2-3afa-42ca-8236-6dc5030c6a61", 00:13:47.936 "is_configured": false, 00:13:47.936 "data_offset": 0, 00:13:47.936 "data_size": 65536 00:13:47.936 }, 00:13:47.936 { 00:13:47.936 "name": "BaseBdev3", 00:13:47.936 "uuid": "91e83292-78fe-4065-8390-a4eccd68c1c7", 00:13:47.936 "is_configured": true, 00:13:47.936 "data_offset": 0, 00:13:47.936 "data_size": 65536 00:13:47.936 } 00:13:47.936 ] 00:13:47.936 }' 00:13:47.936 10:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.936 10:09:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.196 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.196 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:48.456 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:48.456 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:48.716 [2024-06-10 10:09:10.407214] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.716 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:48.976 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.976 "name": "Existed_Raid", 00:13:48.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.976 "strip_size_kb": 64, 00:13:48.976 "state": "configuring", 00:13:48.976 "raid_level": "concat", 00:13:48.976 "superblock": false, 00:13:48.976 "num_base_bdevs": 3, 00:13:48.976 "num_base_bdevs_discovered": 1, 00:13:48.976 "num_base_bdevs_operational": 3, 00:13:48.976 "base_bdevs_list": [ 00:13:48.976 { 00:13:48.976 "name": null, 00:13:48.976 "uuid": "2d2fb6c0-5122-46ad-af99-b21d0731dc2b", 00:13:48.976 "is_configured": false, 00:13:48.976 "data_offset": 0, 00:13:48.976 "data_size": 65536 00:13:48.976 }, 00:13:48.976 { 00:13:48.976 "name": null, 00:13:48.976 "uuid": "4d01f1d2-3afa-42ca-8236-6dc5030c6a61", 00:13:48.976 "is_configured": false, 00:13:48.976 "data_offset": 0, 00:13:48.976 "data_size": 65536 00:13:48.976 }, 00:13:48.976 { 00:13:48.976 "name": "BaseBdev3", 00:13:48.976 "uuid": "91e83292-78fe-4065-8390-a4eccd68c1c7", 00:13:48.976 "is_configured": true, 00:13:48.976 "data_offset": 0, 00:13:48.976 "data_size": 65536 00:13:48.976 } 00:13:48.976 ] 00:13:48.976 }' 00:13:48.976 10:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.976 10:09:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.545 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.545 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:49.545 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:49.545 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:49.804 [2024-06-10 10:09:11.507818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:49.804 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:49.804 10:09:11 
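# bdev_raid_add_base_bdev re-attaches a device to an existing, still-configuring array,
# and the per-slot jq probes check whether each base_bdevs_list entry is populated. The
# add/probe pair as exercised here, in short:
#   rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2
#   rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | \
#          jq '.[0].base_bdevs_list[1].is_configured'   # true once BaseBdev2 is claimed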
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:49.804 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:49.804 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:49.804 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:49.804 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:49.804 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.804 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.804 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.804 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.804 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.804 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.063 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.063 "name": "Existed_Raid", 00:13:50.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.063 "strip_size_kb": 64, 00:13:50.063 "state": "configuring", 00:13:50.063 "raid_level": "concat", 00:13:50.063 "superblock": false, 00:13:50.063 "num_base_bdevs": 3, 00:13:50.063 "num_base_bdevs_discovered": 2, 00:13:50.064 "num_base_bdevs_operational": 3, 00:13:50.064 "base_bdevs_list": [ 00:13:50.064 { 00:13:50.064 "name": null, 00:13:50.064 "uuid": "2d2fb6c0-5122-46ad-af99-b21d0731dc2b", 00:13:50.064 "is_configured": false, 00:13:50.064 "data_offset": 0, 00:13:50.064 "data_size": 65536 00:13:50.064 }, 00:13:50.064 { 00:13:50.064 "name": "BaseBdev2", 00:13:50.064 "uuid": "4d01f1d2-3afa-42ca-8236-6dc5030c6a61", 00:13:50.064 "is_configured": true, 00:13:50.064 "data_offset": 0, 00:13:50.064 "data_size": 65536 00:13:50.064 }, 00:13:50.064 { 00:13:50.064 "name": "BaseBdev3", 00:13:50.064 "uuid": "91e83292-78fe-4065-8390-a4eccd68c1c7", 00:13:50.064 "is_configured": true, 00:13:50.064 "data_offset": 0, 00:13:50.064 "data_size": 65536 00:13:50.064 } 00:13:50.064 ] 00:13:50.064 }' 00:13:50.064 10:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.064 10:09:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.633 10:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:50.633 10:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.633 10:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:50.633 10:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.633 10:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:50.893 10:09:12 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2d2fb6c0-5122-46ad-af99-b21d0731dc2b 00:13:51.152 [2024-06-10 10:09:12.783969] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:51.152 [2024-06-10 10:09:12.783993] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb41640 00:13:51.152 [2024-06-10 10:09:12.783997] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:51.152 [2024-06-10 10:09:12.784151] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb42250 00:13:51.152 [2024-06-10 10:09:12.784238] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb41640 00:13:51.152 [2024-06-10 10:09:12.784243] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb41640 00:13:51.152 [2024-06-10 10:09:12.784361] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:51.152 NewBaseBdev 00:13:51.152 10:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:51.152 10:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:13:51.152 10:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:51.152 10:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:51.153 10:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:51.153 10:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:51.153 10:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:51.153 10:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:51.412 [ 00:13:51.412 { 00:13:51.413 "name": "NewBaseBdev", 00:13:51.413 "aliases": [ 00:13:51.413 "2d2fb6c0-5122-46ad-af99-b21d0731dc2b" 00:13:51.413 ], 00:13:51.413 "product_name": "Malloc disk", 00:13:51.413 "block_size": 512, 00:13:51.413 "num_blocks": 65536, 00:13:51.413 "uuid": "2d2fb6c0-5122-46ad-af99-b21d0731dc2b", 00:13:51.413 "assigned_rate_limits": { 00:13:51.413 "rw_ios_per_sec": 0, 00:13:51.413 "rw_mbytes_per_sec": 0, 00:13:51.413 "r_mbytes_per_sec": 0, 00:13:51.413 "w_mbytes_per_sec": 0 00:13:51.413 }, 00:13:51.413 "claimed": true, 00:13:51.413 "claim_type": "exclusive_write", 00:13:51.413 "zoned": false, 00:13:51.413 "supported_io_types": { 00:13:51.413 "read": true, 00:13:51.413 "write": true, 00:13:51.413 "unmap": true, 00:13:51.413 "write_zeroes": true, 00:13:51.413 "flush": true, 00:13:51.413 "reset": true, 00:13:51.413 "compare": false, 00:13:51.413 "compare_and_write": false, 00:13:51.413 "abort": true, 00:13:51.413 "nvme_admin": false, 00:13:51.413 "nvme_io": false 00:13:51.413 }, 00:13:51.413 "memory_domains": [ 00:13:51.413 { 00:13:51.413 "dma_device_id": "system", 00:13:51.413 "dma_device_type": 1 00:13:51.413 }, 00:13:51.413 { 00:13:51.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.413 "dma_device_type": 2 00:13:51.413 } 00:13:51.413 ], 00:13:51.413 "driver_specific": {} 00:13:51.413 
} 00:13:51.413 ] 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.413 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:51.672 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:51.672 "name": "Existed_Raid", 00:13:51.672 "uuid": "0e1e754a-c7ee-4219-8b61-768bb11f675c", 00:13:51.672 "strip_size_kb": 64, 00:13:51.672 "state": "online", 00:13:51.672 "raid_level": "concat", 00:13:51.672 "superblock": false, 00:13:51.672 "num_base_bdevs": 3, 00:13:51.672 "num_base_bdevs_discovered": 3, 00:13:51.672 "num_base_bdevs_operational": 3, 00:13:51.672 "base_bdevs_list": [ 00:13:51.672 { 00:13:51.672 "name": "NewBaseBdev", 00:13:51.672 "uuid": "2d2fb6c0-5122-46ad-af99-b21d0731dc2b", 00:13:51.672 "is_configured": true, 00:13:51.672 "data_offset": 0, 00:13:51.672 "data_size": 65536 00:13:51.672 }, 00:13:51.672 { 00:13:51.672 "name": "BaseBdev2", 00:13:51.672 "uuid": "4d01f1d2-3afa-42ca-8236-6dc5030c6a61", 00:13:51.672 "is_configured": true, 00:13:51.672 "data_offset": 0, 00:13:51.672 "data_size": 65536 00:13:51.672 }, 00:13:51.672 { 00:13:51.672 "name": "BaseBdev3", 00:13:51.672 "uuid": "91e83292-78fe-4065-8390-a4eccd68c1c7", 00:13:51.672 "is_configured": true, 00:13:51.672 "data_offset": 0, 00:13:51.672 "data_size": 65536 00:13:51.672 } 00:13:51.672 ] 00:13:51.672 }' 00:13:51.672 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:51.672 10:09:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.932 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:51.932 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:51.932 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:51.932 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:51.932 10:09:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:51.932 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:51.932 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:51.932 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:52.193 [2024-06-10 10:09:13.907058] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:52.193 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:52.193 "name": "Existed_Raid", 00:13:52.193 "aliases": [ 00:13:52.193 "0e1e754a-c7ee-4219-8b61-768bb11f675c" 00:13:52.193 ], 00:13:52.193 "product_name": "Raid Volume", 00:13:52.193 "block_size": 512, 00:13:52.193 "num_blocks": 196608, 00:13:52.193 "uuid": "0e1e754a-c7ee-4219-8b61-768bb11f675c", 00:13:52.193 "assigned_rate_limits": { 00:13:52.193 "rw_ios_per_sec": 0, 00:13:52.193 "rw_mbytes_per_sec": 0, 00:13:52.193 "r_mbytes_per_sec": 0, 00:13:52.193 "w_mbytes_per_sec": 0 00:13:52.193 }, 00:13:52.193 "claimed": false, 00:13:52.193 "zoned": false, 00:13:52.193 "supported_io_types": { 00:13:52.193 "read": true, 00:13:52.193 "write": true, 00:13:52.193 "unmap": true, 00:13:52.193 "write_zeroes": true, 00:13:52.193 "flush": true, 00:13:52.193 "reset": true, 00:13:52.193 "compare": false, 00:13:52.193 "compare_and_write": false, 00:13:52.193 "abort": false, 00:13:52.193 "nvme_admin": false, 00:13:52.193 "nvme_io": false 00:13:52.193 }, 00:13:52.193 "memory_domains": [ 00:13:52.193 { 00:13:52.193 "dma_device_id": "system", 00:13:52.193 "dma_device_type": 1 00:13:52.193 }, 00:13:52.193 { 00:13:52.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.193 "dma_device_type": 2 00:13:52.193 }, 00:13:52.193 { 00:13:52.193 "dma_device_id": "system", 00:13:52.193 "dma_device_type": 1 00:13:52.193 }, 00:13:52.193 { 00:13:52.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.193 "dma_device_type": 2 00:13:52.193 }, 00:13:52.193 { 00:13:52.193 "dma_device_id": "system", 00:13:52.193 "dma_device_type": 1 00:13:52.193 }, 00:13:52.193 { 00:13:52.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.193 "dma_device_type": 2 00:13:52.193 } 00:13:52.193 ], 00:13:52.193 "driver_specific": { 00:13:52.193 "raid": { 00:13:52.193 "uuid": "0e1e754a-c7ee-4219-8b61-768bb11f675c", 00:13:52.193 "strip_size_kb": 64, 00:13:52.193 "state": "online", 00:13:52.193 "raid_level": "concat", 00:13:52.193 "superblock": false, 00:13:52.193 "num_base_bdevs": 3, 00:13:52.193 "num_base_bdevs_discovered": 3, 00:13:52.193 "num_base_bdevs_operational": 3, 00:13:52.193 "base_bdevs_list": [ 00:13:52.193 { 00:13:52.193 "name": "NewBaseBdev", 00:13:52.193 "uuid": "2d2fb6c0-5122-46ad-af99-b21d0731dc2b", 00:13:52.193 "is_configured": true, 00:13:52.193 "data_offset": 0, 00:13:52.193 "data_size": 65536 00:13:52.193 }, 00:13:52.193 { 00:13:52.193 "name": "BaseBdev2", 00:13:52.193 "uuid": "4d01f1d2-3afa-42ca-8236-6dc5030c6a61", 00:13:52.193 "is_configured": true, 00:13:52.193 "data_offset": 0, 00:13:52.193 "data_size": 65536 00:13:52.193 }, 00:13:52.193 { 00:13:52.193 "name": "BaseBdev3", 00:13:52.193 "uuid": "91e83292-78fe-4065-8390-a4eccd68c1c7", 00:13:52.193 "is_configured": true, 00:13:52.193 "data_offset": 0, 00:13:52.193 "data_size": 65536 00:13:52.193 } 00:13:52.193 ] 00:13:52.193 } 00:13:52.193 } 00:13:52.193 }' 00:13:52.193 10:09:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:52.193 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:52.193 BaseBdev2 00:13:52.193 BaseBdev3' 00:13:52.193 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:52.193 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:52.193 10:09:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:52.453 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:52.453 "name": "NewBaseBdev", 00:13:52.453 "aliases": [ 00:13:52.453 "2d2fb6c0-5122-46ad-af99-b21d0731dc2b" 00:13:52.453 ], 00:13:52.453 "product_name": "Malloc disk", 00:13:52.453 "block_size": 512, 00:13:52.453 "num_blocks": 65536, 00:13:52.453 "uuid": "2d2fb6c0-5122-46ad-af99-b21d0731dc2b", 00:13:52.453 "assigned_rate_limits": { 00:13:52.453 "rw_ios_per_sec": 0, 00:13:52.453 "rw_mbytes_per_sec": 0, 00:13:52.453 "r_mbytes_per_sec": 0, 00:13:52.453 "w_mbytes_per_sec": 0 00:13:52.453 }, 00:13:52.453 "claimed": true, 00:13:52.453 "claim_type": "exclusive_write", 00:13:52.453 "zoned": false, 00:13:52.453 "supported_io_types": { 00:13:52.453 "read": true, 00:13:52.453 "write": true, 00:13:52.453 "unmap": true, 00:13:52.453 "write_zeroes": true, 00:13:52.453 "flush": true, 00:13:52.453 "reset": true, 00:13:52.453 "compare": false, 00:13:52.453 "compare_and_write": false, 00:13:52.453 "abort": true, 00:13:52.453 "nvme_admin": false, 00:13:52.453 "nvme_io": false 00:13:52.453 }, 00:13:52.453 "memory_domains": [ 00:13:52.453 { 00:13:52.453 "dma_device_id": "system", 00:13:52.453 "dma_device_type": 1 00:13:52.453 }, 00:13:52.453 { 00:13:52.453 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.453 "dma_device_type": 2 00:13:52.453 } 00:13:52.453 ], 00:13:52.453 "driver_specific": {} 00:13:52.453 }' 00:13:52.453 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.453 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.453 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:52.453 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.453 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.453 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:52.453 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.714 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.714 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:52.714 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.714 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.714 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:52.714 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:52.714 10:09:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:52.714 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:52.974 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:52.974 "name": "BaseBdev2", 00:13:52.974 "aliases": [ 00:13:52.974 "4d01f1d2-3afa-42ca-8236-6dc5030c6a61" 00:13:52.974 ], 00:13:52.974 "product_name": "Malloc disk", 00:13:52.974 "block_size": 512, 00:13:52.974 "num_blocks": 65536, 00:13:52.974 "uuid": "4d01f1d2-3afa-42ca-8236-6dc5030c6a61", 00:13:52.974 "assigned_rate_limits": { 00:13:52.974 "rw_ios_per_sec": 0, 00:13:52.974 "rw_mbytes_per_sec": 0, 00:13:52.974 "r_mbytes_per_sec": 0, 00:13:52.974 "w_mbytes_per_sec": 0 00:13:52.974 }, 00:13:52.974 "claimed": true, 00:13:52.974 "claim_type": "exclusive_write", 00:13:52.974 "zoned": false, 00:13:52.974 "supported_io_types": { 00:13:52.974 "read": true, 00:13:52.974 "write": true, 00:13:52.974 "unmap": true, 00:13:52.974 "write_zeroes": true, 00:13:52.974 "flush": true, 00:13:52.974 "reset": true, 00:13:52.974 "compare": false, 00:13:52.974 "compare_and_write": false, 00:13:52.974 "abort": true, 00:13:52.974 "nvme_admin": false, 00:13:52.974 "nvme_io": false 00:13:52.974 }, 00:13:52.974 "memory_domains": [ 00:13:52.974 { 00:13:52.974 "dma_device_id": "system", 00:13:52.974 "dma_device_type": 1 00:13:52.974 }, 00:13:52.974 { 00:13:52.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.974 "dma_device_type": 2 00:13:52.974 } 00:13:52.974 ], 00:13:52.974 "driver_specific": {} 00:13:52.974 }' 00:13:52.974 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.974 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.974 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:52.974 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.974 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.974 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:52.974 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.974 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.233 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:53.233 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.233 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.233 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:53.233 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:53.233 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:53.233 10:09:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:53.493 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:53.493 "name": "BaseBdev3", 00:13:53.493 "aliases": [ 00:13:53.493 
"91e83292-78fe-4065-8390-a4eccd68c1c7" 00:13:53.493 ], 00:13:53.493 "product_name": "Malloc disk", 00:13:53.493 "block_size": 512, 00:13:53.493 "num_blocks": 65536, 00:13:53.493 "uuid": "91e83292-78fe-4065-8390-a4eccd68c1c7", 00:13:53.493 "assigned_rate_limits": { 00:13:53.493 "rw_ios_per_sec": 0, 00:13:53.493 "rw_mbytes_per_sec": 0, 00:13:53.493 "r_mbytes_per_sec": 0, 00:13:53.493 "w_mbytes_per_sec": 0 00:13:53.493 }, 00:13:53.493 "claimed": true, 00:13:53.493 "claim_type": "exclusive_write", 00:13:53.493 "zoned": false, 00:13:53.493 "supported_io_types": { 00:13:53.493 "read": true, 00:13:53.493 "write": true, 00:13:53.493 "unmap": true, 00:13:53.493 "write_zeroes": true, 00:13:53.493 "flush": true, 00:13:53.493 "reset": true, 00:13:53.493 "compare": false, 00:13:53.493 "compare_and_write": false, 00:13:53.493 "abort": true, 00:13:53.493 "nvme_admin": false, 00:13:53.493 "nvme_io": false 00:13:53.493 }, 00:13:53.493 "memory_domains": [ 00:13:53.493 { 00:13:53.493 "dma_device_id": "system", 00:13:53.493 "dma_device_type": 1 00:13:53.493 }, 00:13:53.493 { 00:13:53.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.493 "dma_device_type": 2 00:13:53.493 } 00:13:53.493 ], 00:13:53.493 "driver_specific": {} 00:13:53.493 }' 00:13:53.493 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:53.493 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:53.493 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:53.493 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.493 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.493 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:53.493 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.493 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.493 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:53.493 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:53.753 [2024-06-10 10:09:15.534983] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:53.753 [2024-06-10 10:09:15.534999] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:53.753 [2024-06-10 10:09:15.535039] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:53.753 [2024-06-10 10:09:15.535077] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:53.753 [2024-06-10 10:09:15.535082] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb41640 name Existed_Raid, state offline 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 992536 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- 
# '[' -z 992536 ']' 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 992536 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 992536 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 992536' 00:13:53.753 killing process with pid 992536 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 992536 00:13:53.753 [2024-06-10 10:09:15.606233] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:53.753 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 992536 00:13:54.013 [2024-06-10 10:09:15.620856] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:54.013 00:13:54.013 real 0m22.940s 00:13:54.013 user 0m42.896s 00:13:54.013 sys 0m3.358s 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.013 ************************************ 00:13:54.013 END TEST raid_state_function_test 00:13:54.013 ************************************ 00:13:54.013 10:09:15 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:13:54.013 10:09:15 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:13:54.013 10:09:15 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:54.013 10:09:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:54.013 ************************************ 00:13:54.013 START TEST raid_state_function_test_sb 00:13:54.013 ************************************ 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 3 true 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:54.013 10:09:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=997459 00:13:54.013 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 997459' 00:13:54.014 Process raid pid: 997459 00:13:54.014 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 997459 /var/tmp/spdk-raid.sock 00:13:54.014 10:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:54.014 10:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 997459 ']' 00:13:54.014 10:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:54.014 10:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:13:54.014 10:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:54.014 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:54.014 10:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:54.014 10:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:54.274 [2024-06-10 10:09:15.886745] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
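For reference, the RPC/jq pattern exercised by the traces above and below can be reproduced by hand against the same test socket. This is a minimal sketch only, assuming a bdev_svc app is already listening on /var/tmp/spdk-raid.sock and that rpc.py sits at the workspace path shown throughout this log; it uses only calls that appear verbatim in the traced test, and the helper name "rpc" is just local shorthand, not part of the test scripts.
# Shorthand for the per-test RPC socket used throughout this run (assumed path).
rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
# Create three 32 MiB malloc base bdevs with 512-byte blocks (65536 blocks x 512 B), as the traced test does.
rpc bdev_malloc_create 32 512 -b BaseBdev1
rpc bdev_malloc_create 32 512 -b BaseBdev2
rpc bdev_malloc_create 32 512 -b BaseBdev3
# Assemble a concat raid bdev with a 64 KiB strip size; -s requests the superblock variant under test here.
rpc bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
# Check state the same way verify_raid_bdev_state does: dump all raid bdevs, pick the one under
# test with jq, then compare fields such as .state, .raid_level, .strip_size_kb and base_bdevs_list.
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
# Tear down when finished.
rpc bdev_raid_delete Existed_Raid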
00:13:54.274 [2024-06-10 10:09:15.886793] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:54.274 [2024-06-10 10:09:15.974165] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:54.274 [2024-06-10 10:09:16.036485] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:13:54.274 [2024-06-10 10:09:16.081643] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:54.274 [2024-06-10 10:09:16.081666] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:54.844 10:09:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:54.844 10:09:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:13:54.844 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:55.104 [2024-06-10 10:09:16.868976] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:55.104 [2024-06-10 10:09:16.869007] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:55.104 [2024-06-10 10:09:16.869014] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:55.104 [2024-06-10 10:09:16.869023] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:55.104 [2024-06-10 10:09:16.869027] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:55.104 [2024-06-10 10:09:16.869033] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:55.104 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:55.104 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:55.104 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:55.104 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:55.104 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:55.104 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:55.104 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.104 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.104 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.104 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.104 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.105 10:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.365 10:09:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.365 "name": "Existed_Raid", 00:13:55.365 "uuid": "4a6267ab-7991-4e1a-8529-5e7dcc96d2bf", 00:13:55.365 "strip_size_kb": 64, 00:13:55.365 "state": "configuring", 00:13:55.365 "raid_level": "concat", 00:13:55.365 "superblock": true, 00:13:55.365 "num_base_bdevs": 3, 00:13:55.365 "num_base_bdevs_discovered": 0, 00:13:55.365 "num_base_bdevs_operational": 3, 00:13:55.365 "base_bdevs_list": [ 00:13:55.365 { 00:13:55.365 "name": "BaseBdev1", 00:13:55.365 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.365 "is_configured": false, 00:13:55.365 "data_offset": 0, 00:13:55.365 "data_size": 0 00:13:55.365 }, 00:13:55.365 { 00:13:55.365 "name": "BaseBdev2", 00:13:55.365 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.365 "is_configured": false, 00:13:55.365 "data_offset": 0, 00:13:55.365 "data_size": 0 00:13:55.365 }, 00:13:55.365 { 00:13:55.365 "name": "BaseBdev3", 00:13:55.365 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.365 "is_configured": false, 00:13:55.365 "data_offset": 0, 00:13:55.365 "data_size": 0 00:13:55.365 } 00:13:55.365 ] 00:13:55.365 }' 00:13:55.365 10:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.365 10:09:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:55.934 10:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:56.195 [2024-06-10 10:09:17.803210] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:56.195 [2024-06-10 10:09:17.803227] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xabbb00 name Existed_Raid, state configuring 00:13:56.195 10:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:56.195 [2024-06-10 10:09:17.995724] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:56.195 [2024-06-10 10:09:17.995743] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:56.195 [2024-06-10 10:09:17.995749] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:56.195 [2024-06-10 10:09:17.995755] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:56.195 [2024-06-10 10:09:17.995759] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:56.195 [2024-06-10 10:09:17.995772] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:56.195 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:56.456 [2024-06-10 10:09:18.182792] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:56.456 BaseBdev1 00:13:56.456 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:56.456 10:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:13:56.456 10:09:18 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:56.456 10:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:13:56.456 10:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:56.456 10:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:56.456 10:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:56.716 [ 00:13:56.716 { 00:13:56.716 "name": "BaseBdev1", 00:13:56.716 "aliases": [ 00:13:56.716 "54ed16a8-c63a-47b8-b44c-a8e188e2fba9" 00:13:56.716 ], 00:13:56.716 "product_name": "Malloc disk", 00:13:56.716 "block_size": 512, 00:13:56.716 "num_blocks": 65536, 00:13:56.716 "uuid": "54ed16a8-c63a-47b8-b44c-a8e188e2fba9", 00:13:56.716 "assigned_rate_limits": { 00:13:56.716 "rw_ios_per_sec": 0, 00:13:56.716 "rw_mbytes_per_sec": 0, 00:13:56.716 "r_mbytes_per_sec": 0, 00:13:56.716 "w_mbytes_per_sec": 0 00:13:56.716 }, 00:13:56.716 "claimed": true, 00:13:56.716 "claim_type": "exclusive_write", 00:13:56.716 "zoned": false, 00:13:56.716 "supported_io_types": { 00:13:56.716 "read": true, 00:13:56.716 "write": true, 00:13:56.716 "unmap": true, 00:13:56.716 "write_zeroes": true, 00:13:56.716 "flush": true, 00:13:56.716 "reset": true, 00:13:56.716 "compare": false, 00:13:56.716 "compare_and_write": false, 00:13:56.716 "abort": true, 00:13:56.716 "nvme_admin": false, 00:13:56.716 "nvme_io": false 00:13:56.716 }, 00:13:56.716 "memory_domains": [ 00:13:56.716 { 00:13:56.716 "dma_device_id": "system", 00:13:56.716 "dma_device_type": 1 00:13:56.716 }, 00:13:56.716 { 00:13:56.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:56.716 "dma_device_type": 2 00:13:56.716 } 00:13:56.716 ], 00:13:56.716 "driver_specific": {} 00:13:56.716 } 00:13:56.716 ] 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:56.716 10:09:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.716 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:56.977 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.977 "name": "Existed_Raid", 00:13:56.977 "uuid": "ce586d01-bfe8-41d5-a8e2-3e8140537047", 00:13:56.977 "strip_size_kb": 64, 00:13:56.977 "state": "configuring", 00:13:56.977 "raid_level": "concat", 00:13:56.977 "superblock": true, 00:13:56.977 "num_base_bdevs": 3, 00:13:56.977 "num_base_bdevs_discovered": 1, 00:13:56.977 "num_base_bdevs_operational": 3, 00:13:56.977 "base_bdevs_list": [ 00:13:56.977 { 00:13:56.977 "name": "BaseBdev1", 00:13:56.977 "uuid": "54ed16a8-c63a-47b8-b44c-a8e188e2fba9", 00:13:56.977 "is_configured": true, 00:13:56.977 "data_offset": 2048, 00:13:56.977 "data_size": 63488 00:13:56.977 }, 00:13:56.977 { 00:13:56.977 "name": "BaseBdev2", 00:13:56.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.977 "is_configured": false, 00:13:56.977 "data_offset": 0, 00:13:56.977 "data_size": 0 00:13:56.977 }, 00:13:56.977 { 00:13:56.977 "name": "BaseBdev3", 00:13:56.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.977 "is_configured": false, 00:13:56.977 "data_offset": 0, 00:13:56.977 "data_size": 0 00:13:56.977 } 00:13:56.977 ] 00:13:56.977 }' 00:13:56.977 10:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.977 10:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.547 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:57.810 [2024-06-10 10:09:19.417922] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:57.810 [2024-06-10 10:09:19.417947] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xabb3f0 name Existed_Raid, state configuring 00:13:57.810 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:57.810 [2024-06-10 10:09:19.610441] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:57.810 [2024-06-10 10:09:19.611569] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:57.810 [2024-06-10 10:09:19.611592] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:57.810 [2024-06-10 10:09:19.611598] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:57.810 [2024-06-10 10:09:19.611603] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:57.810 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:57.810 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:57.810 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:57.810 10:09:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.811 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.811 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:57.811 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:57.811 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.811 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.811 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.811 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.811 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.811 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.811 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.108 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.108 "name": "Existed_Raid", 00:13:58.108 "uuid": "af7ac639-c511-4cd1-80d2-131b4f42f5d0", 00:13:58.108 "strip_size_kb": 64, 00:13:58.108 "state": "configuring", 00:13:58.108 "raid_level": "concat", 00:13:58.108 "superblock": true, 00:13:58.108 "num_base_bdevs": 3, 00:13:58.108 "num_base_bdevs_discovered": 1, 00:13:58.108 "num_base_bdevs_operational": 3, 00:13:58.108 "base_bdevs_list": [ 00:13:58.108 { 00:13:58.108 "name": "BaseBdev1", 00:13:58.108 "uuid": "54ed16a8-c63a-47b8-b44c-a8e188e2fba9", 00:13:58.108 "is_configured": true, 00:13:58.108 "data_offset": 2048, 00:13:58.108 "data_size": 63488 00:13:58.108 }, 00:13:58.108 { 00:13:58.108 "name": "BaseBdev2", 00:13:58.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.108 "is_configured": false, 00:13:58.108 "data_offset": 0, 00:13:58.108 "data_size": 0 00:13:58.108 }, 00:13:58.108 { 00:13:58.108 "name": "BaseBdev3", 00:13:58.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.108 "is_configured": false, 00:13:58.108 "data_offset": 0, 00:13:58.108 "data_size": 0 00:13:58.108 } 00:13:58.108 ] 00:13:58.108 }' 00:13:58.108 10:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.108 10:09:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:58.680 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:58.680 [2024-06-10 10:09:20.413238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:58.680 BaseBdev2 00:13:58.680 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:58.680 10:09:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:13:58.680 10:09:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:58.680 10:09:20 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:13:58.680 10:09:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:58.680 10:09:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:58.680 10:09:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:58.940 [ 00:13:58.940 { 00:13:58.940 "name": "BaseBdev2", 00:13:58.940 "aliases": [ 00:13:58.940 "a64df65a-029a-4e20-b0d0-c9621cd6bfae" 00:13:58.940 ], 00:13:58.940 "product_name": "Malloc disk", 00:13:58.940 "block_size": 512, 00:13:58.940 "num_blocks": 65536, 00:13:58.940 "uuid": "a64df65a-029a-4e20-b0d0-c9621cd6bfae", 00:13:58.940 "assigned_rate_limits": { 00:13:58.940 "rw_ios_per_sec": 0, 00:13:58.940 "rw_mbytes_per_sec": 0, 00:13:58.940 "r_mbytes_per_sec": 0, 00:13:58.940 "w_mbytes_per_sec": 0 00:13:58.940 }, 00:13:58.940 "claimed": true, 00:13:58.940 "claim_type": "exclusive_write", 00:13:58.940 "zoned": false, 00:13:58.940 "supported_io_types": { 00:13:58.940 "read": true, 00:13:58.940 "write": true, 00:13:58.940 "unmap": true, 00:13:58.940 "write_zeroes": true, 00:13:58.940 "flush": true, 00:13:58.940 "reset": true, 00:13:58.940 "compare": false, 00:13:58.940 "compare_and_write": false, 00:13:58.940 "abort": true, 00:13:58.940 "nvme_admin": false, 00:13:58.940 "nvme_io": false 00:13:58.940 }, 00:13:58.940 "memory_domains": [ 00:13:58.940 { 00:13:58.940 "dma_device_id": "system", 00:13:58.940 "dma_device_type": 1 00:13:58.940 }, 00:13:58.940 { 00:13:58.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.940 "dma_device_type": 2 00:13:58.940 } 00:13:58.940 ], 00:13:58.940 "driver_specific": {} 00:13:58.940 } 00:13:58.940 ] 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.940 10:09:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.940 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.200 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.200 "name": "Existed_Raid", 00:13:59.200 "uuid": "af7ac639-c511-4cd1-80d2-131b4f42f5d0", 00:13:59.200 "strip_size_kb": 64, 00:13:59.200 "state": "configuring", 00:13:59.200 "raid_level": "concat", 00:13:59.200 "superblock": true, 00:13:59.200 "num_base_bdevs": 3, 00:13:59.200 "num_base_bdevs_discovered": 2, 00:13:59.200 "num_base_bdevs_operational": 3, 00:13:59.200 "base_bdevs_list": [ 00:13:59.200 { 00:13:59.200 "name": "BaseBdev1", 00:13:59.200 "uuid": "54ed16a8-c63a-47b8-b44c-a8e188e2fba9", 00:13:59.200 "is_configured": true, 00:13:59.200 "data_offset": 2048, 00:13:59.200 "data_size": 63488 00:13:59.200 }, 00:13:59.200 { 00:13:59.200 "name": "BaseBdev2", 00:13:59.200 "uuid": "a64df65a-029a-4e20-b0d0-c9621cd6bfae", 00:13:59.200 "is_configured": true, 00:13:59.200 "data_offset": 2048, 00:13:59.200 "data_size": 63488 00:13:59.200 }, 00:13:59.200 { 00:13:59.200 "name": "BaseBdev3", 00:13:59.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.200 "is_configured": false, 00:13:59.200 "data_offset": 0, 00:13:59.200 "data_size": 0 00:13:59.200 } 00:13:59.200 ] 00:13:59.200 }' 00:13:59.200 10:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.200 10:09:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:59.770 10:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:00.031 [2024-06-10 10:09:21.649423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:00.031 [2024-06-10 10:09:21.649540] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xabc2c0 00:14:00.031 [2024-06-10 10:09:21.649548] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:00.031 [2024-06-10 10:09:21.649682] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc5f970 00:14:00.031 [2024-06-10 10:09:21.649768] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xabc2c0 00:14:00.031 [2024-06-10 10:09:21.649774] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xabc2c0 00:14:00.031 [2024-06-10 10:09:21.649849] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:00.031 BaseBdev3 00:14:00.031 10:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:00.031 10:09:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:14:00.031 10:09:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:00.031 10:09:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:00.031 10:09:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:00.031 10:09:21 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:00.031 10:09:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:00.031 10:09:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:00.290 [ 00:14:00.290 { 00:14:00.291 "name": "BaseBdev3", 00:14:00.291 "aliases": [ 00:14:00.291 "b6165a75-054a-4eb2-927c-c18350eeaa45" 00:14:00.291 ], 00:14:00.291 "product_name": "Malloc disk", 00:14:00.291 "block_size": 512, 00:14:00.291 "num_blocks": 65536, 00:14:00.291 "uuid": "b6165a75-054a-4eb2-927c-c18350eeaa45", 00:14:00.291 "assigned_rate_limits": { 00:14:00.291 "rw_ios_per_sec": 0, 00:14:00.291 "rw_mbytes_per_sec": 0, 00:14:00.291 "r_mbytes_per_sec": 0, 00:14:00.291 "w_mbytes_per_sec": 0 00:14:00.291 }, 00:14:00.291 "claimed": true, 00:14:00.291 "claim_type": "exclusive_write", 00:14:00.291 "zoned": false, 00:14:00.291 "supported_io_types": { 00:14:00.291 "read": true, 00:14:00.291 "write": true, 00:14:00.291 "unmap": true, 00:14:00.291 "write_zeroes": true, 00:14:00.291 "flush": true, 00:14:00.291 "reset": true, 00:14:00.291 "compare": false, 00:14:00.291 "compare_and_write": false, 00:14:00.291 "abort": true, 00:14:00.291 "nvme_admin": false, 00:14:00.291 "nvme_io": false 00:14:00.291 }, 00:14:00.291 "memory_domains": [ 00:14:00.291 { 00:14:00.291 "dma_device_id": "system", 00:14:00.291 "dma_device_type": 1 00:14:00.291 }, 00:14:00.291 { 00:14:00.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.291 "dma_device_type": 2 00:14:00.291 } 00:14:00.291 ], 00:14:00.291 "driver_specific": {} 00:14:00.291 } 00:14:00.291 ] 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.291 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.291 10:09:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.551 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.551 "name": "Existed_Raid", 00:14:00.551 "uuid": "af7ac639-c511-4cd1-80d2-131b4f42f5d0", 00:14:00.551 "strip_size_kb": 64, 00:14:00.551 "state": "online", 00:14:00.551 "raid_level": "concat", 00:14:00.551 "superblock": true, 00:14:00.551 "num_base_bdevs": 3, 00:14:00.551 "num_base_bdevs_discovered": 3, 00:14:00.551 "num_base_bdevs_operational": 3, 00:14:00.551 "base_bdevs_list": [ 00:14:00.551 { 00:14:00.551 "name": "BaseBdev1", 00:14:00.551 "uuid": "54ed16a8-c63a-47b8-b44c-a8e188e2fba9", 00:14:00.551 "is_configured": true, 00:14:00.551 "data_offset": 2048, 00:14:00.551 "data_size": 63488 00:14:00.551 }, 00:14:00.551 { 00:14:00.551 "name": "BaseBdev2", 00:14:00.551 "uuid": "a64df65a-029a-4e20-b0d0-c9621cd6bfae", 00:14:00.551 "is_configured": true, 00:14:00.551 "data_offset": 2048, 00:14:00.551 "data_size": 63488 00:14:00.551 }, 00:14:00.551 { 00:14:00.551 "name": "BaseBdev3", 00:14:00.551 "uuid": "b6165a75-054a-4eb2-927c-c18350eeaa45", 00:14:00.551 "is_configured": true, 00:14:00.551 "data_offset": 2048, 00:14:00.551 "data_size": 63488 00:14:00.551 } 00:14:00.551 ] 00:14:00.551 }' 00:14:00.551 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.551 10:09:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:01.121 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:01.121 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:01.121 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:01.121 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:01.121 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:01.121 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:01.121 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:01.121 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:01.121 [2024-06-10 10:09:22.960945] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:01.121 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:01.121 "name": "Existed_Raid", 00:14:01.121 "aliases": [ 00:14:01.121 "af7ac639-c511-4cd1-80d2-131b4f42f5d0" 00:14:01.121 ], 00:14:01.121 "product_name": "Raid Volume", 00:14:01.121 "block_size": 512, 00:14:01.121 "num_blocks": 190464, 00:14:01.121 "uuid": "af7ac639-c511-4cd1-80d2-131b4f42f5d0", 00:14:01.121 "assigned_rate_limits": { 00:14:01.121 "rw_ios_per_sec": 0, 00:14:01.121 "rw_mbytes_per_sec": 0, 00:14:01.121 "r_mbytes_per_sec": 0, 00:14:01.121 "w_mbytes_per_sec": 0 00:14:01.121 }, 00:14:01.121 "claimed": false, 00:14:01.121 "zoned": false, 00:14:01.121 "supported_io_types": { 00:14:01.121 "read": true, 00:14:01.121 "write": true, 00:14:01.121 "unmap": true, 00:14:01.121 "write_zeroes": true, 
00:14:01.121 "flush": true, 00:14:01.121 "reset": true, 00:14:01.121 "compare": false, 00:14:01.121 "compare_and_write": false, 00:14:01.121 "abort": false, 00:14:01.121 "nvme_admin": false, 00:14:01.121 "nvme_io": false 00:14:01.121 }, 00:14:01.121 "memory_domains": [ 00:14:01.121 { 00:14:01.121 "dma_device_id": "system", 00:14:01.121 "dma_device_type": 1 00:14:01.121 }, 00:14:01.121 { 00:14:01.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.121 "dma_device_type": 2 00:14:01.121 }, 00:14:01.121 { 00:14:01.121 "dma_device_id": "system", 00:14:01.121 "dma_device_type": 1 00:14:01.121 }, 00:14:01.121 { 00:14:01.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.121 "dma_device_type": 2 00:14:01.121 }, 00:14:01.121 { 00:14:01.121 "dma_device_id": "system", 00:14:01.121 "dma_device_type": 1 00:14:01.121 }, 00:14:01.121 { 00:14:01.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.121 "dma_device_type": 2 00:14:01.121 } 00:14:01.121 ], 00:14:01.121 "driver_specific": { 00:14:01.121 "raid": { 00:14:01.121 "uuid": "af7ac639-c511-4cd1-80d2-131b4f42f5d0", 00:14:01.121 "strip_size_kb": 64, 00:14:01.121 "state": "online", 00:14:01.121 "raid_level": "concat", 00:14:01.121 "superblock": true, 00:14:01.121 "num_base_bdevs": 3, 00:14:01.121 "num_base_bdevs_discovered": 3, 00:14:01.121 "num_base_bdevs_operational": 3, 00:14:01.121 "base_bdevs_list": [ 00:14:01.121 { 00:14:01.121 "name": "BaseBdev1", 00:14:01.121 "uuid": "54ed16a8-c63a-47b8-b44c-a8e188e2fba9", 00:14:01.121 "is_configured": true, 00:14:01.121 "data_offset": 2048, 00:14:01.121 "data_size": 63488 00:14:01.121 }, 00:14:01.121 { 00:14:01.121 "name": "BaseBdev2", 00:14:01.121 "uuid": "a64df65a-029a-4e20-b0d0-c9621cd6bfae", 00:14:01.121 "is_configured": true, 00:14:01.121 "data_offset": 2048, 00:14:01.121 "data_size": 63488 00:14:01.121 }, 00:14:01.121 { 00:14:01.121 "name": "BaseBdev3", 00:14:01.121 "uuid": "b6165a75-054a-4eb2-927c-c18350eeaa45", 00:14:01.121 "is_configured": true, 00:14:01.121 "data_offset": 2048, 00:14:01.121 "data_size": 63488 00:14:01.121 } 00:14:01.121 ] 00:14:01.121 } 00:14:01.121 } 00:14:01.121 }' 00:14:01.121 10:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:01.382 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:01.382 BaseBdev2 00:14:01.382 BaseBdev3' 00:14:01.382 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:01.382 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:01.382 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:01.382 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:01.382 "name": "BaseBdev1", 00:14:01.382 "aliases": [ 00:14:01.382 "54ed16a8-c63a-47b8-b44c-a8e188e2fba9" 00:14:01.382 ], 00:14:01.382 "product_name": "Malloc disk", 00:14:01.382 "block_size": 512, 00:14:01.382 "num_blocks": 65536, 00:14:01.382 "uuid": "54ed16a8-c63a-47b8-b44c-a8e188e2fba9", 00:14:01.382 "assigned_rate_limits": { 00:14:01.382 "rw_ios_per_sec": 0, 00:14:01.382 "rw_mbytes_per_sec": 0, 00:14:01.382 "r_mbytes_per_sec": 0, 00:14:01.382 "w_mbytes_per_sec": 0 00:14:01.382 }, 00:14:01.382 "claimed": true, 00:14:01.382 "claim_type": 
"exclusive_write", 00:14:01.382 "zoned": false, 00:14:01.382 "supported_io_types": { 00:14:01.382 "read": true, 00:14:01.382 "write": true, 00:14:01.382 "unmap": true, 00:14:01.382 "write_zeroes": true, 00:14:01.382 "flush": true, 00:14:01.382 "reset": true, 00:14:01.382 "compare": false, 00:14:01.382 "compare_and_write": false, 00:14:01.382 "abort": true, 00:14:01.382 "nvme_admin": false, 00:14:01.382 "nvme_io": false 00:14:01.382 }, 00:14:01.382 "memory_domains": [ 00:14:01.382 { 00:14:01.382 "dma_device_id": "system", 00:14:01.382 "dma_device_type": 1 00:14:01.382 }, 00:14:01.382 { 00:14:01.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.382 "dma_device_type": 2 00:14:01.382 } 00:14:01.382 ], 00:14:01.382 "driver_specific": {} 00:14:01.382 }' 00:14:01.382 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:01.382 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:01.643 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:01.903 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:01.903 "name": "BaseBdev2", 00:14:01.903 "aliases": [ 00:14:01.903 "a64df65a-029a-4e20-b0d0-c9621cd6bfae" 00:14:01.903 ], 00:14:01.903 "product_name": "Malloc disk", 00:14:01.903 "block_size": 512, 00:14:01.903 "num_blocks": 65536, 00:14:01.903 "uuid": "a64df65a-029a-4e20-b0d0-c9621cd6bfae", 00:14:01.903 "assigned_rate_limits": { 00:14:01.903 "rw_ios_per_sec": 0, 00:14:01.903 "rw_mbytes_per_sec": 0, 00:14:01.903 "r_mbytes_per_sec": 0, 00:14:01.903 "w_mbytes_per_sec": 0 00:14:01.903 }, 00:14:01.903 "claimed": true, 00:14:01.903 "claim_type": "exclusive_write", 00:14:01.903 "zoned": false, 00:14:01.903 "supported_io_types": { 00:14:01.903 "read": true, 00:14:01.903 "write": true, 00:14:01.903 "unmap": true, 00:14:01.903 "write_zeroes": true, 00:14:01.903 "flush": true, 00:14:01.903 "reset": true, 00:14:01.903 "compare": false, 00:14:01.903 "compare_and_write": false, 00:14:01.903 "abort": true, 00:14:01.903 "nvme_admin": false, 00:14:01.903 "nvme_io": false 
00:14:01.903 }, 00:14:01.903 "memory_domains": [ 00:14:01.903 { 00:14:01.903 "dma_device_id": "system", 00:14:01.903 "dma_device_type": 1 00:14:01.903 }, 00:14:01.903 { 00:14:01.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.903 "dma_device_type": 2 00:14:01.903 } 00:14:01.903 ], 00:14:01.903 "driver_specific": {} 00:14:01.903 }' 00:14:01.903 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:01.903 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:01.903 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:01.903 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:02.163 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:02.163 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:02.163 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:02.163 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:02.163 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:02.163 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:02.163 10:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:02.423 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:02.423 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:02.423 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:02.423 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:02.423 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:02.423 "name": "BaseBdev3", 00:14:02.423 "aliases": [ 00:14:02.423 "b6165a75-054a-4eb2-927c-c18350eeaa45" 00:14:02.423 ], 00:14:02.423 "product_name": "Malloc disk", 00:14:02.423 "block_size": 512, 00:14:02.423 "num_blocks": 65536, 00:14:02.423 "uuid": "b6165a75-054a-4eb2-927c-c18350eeaa45", 00:14:02.423 "assigned_rate_limits": { 00:14:02.423 "rw_ios_per_sec": 0, 00:14:02.423 "rw_mbytes_per_sec": 0, 00:14:02.423 "r_mbytes_per_sec": 0, 00:14:02.423 "w_mbytes_per_sec": 0 00:14:02.423 }, 00:14:02.423 "claimed": true, 00:14:02.423 "claim_type": "exclusive_write", 00:14:02.423 "zoned": false, 00:14:02.423 "supported_io_types": { 00:14:02.423 "read": true, 00:14:02.423 "write": true, 00:14:02.423 "unmap": true, 00:14:02.423 "write_zeroes": true, 00:14:02.423 "flush": true, 00:14:02.423 "reset": true, 00:14:02.423 "compare": false, 00:14:02.423 "compare_and_write": false, 00:14:02.423 "abort": true, 00:14:02.423 "nvme_admin": false, 00:14:02.423 "nvme_io": false 00:14:02.423 }, 00:14:02.423 "memory_domains": [ 00:14:02.423 { 00:14:02.423 "dma_device_id": "system", 00:14:02.423 "dma_device_type": 1 00:14:02.423 }, 00:14:02.423 { 00:14:02.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.423 "dma_device_type": 2 00:14:02.423 } 00:14:02.423 ], 00:14:02.423 "driver_specific": {} 00:14:02.423 }' 00:14:02.423 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:14:02.423 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:02.683 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:02.683 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:02.683 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:02.683 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:02.683 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:02.683 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:02.683 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:02.683 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:02.683 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:02.943 [2024-06-10 10:09:24.705189] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:02.943 [2024-06-10 10:09:24.705206] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:02.943 [2024-06-10 10:09:24.705240] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.943 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.944 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.944 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.944 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.944 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.204 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.204 "name": "Existed_Raid", 00:14:03.204 "uuid": "af7ac639-c511-4cd1-80d2-131b4f42f5d0", 00:14:03.204 "strip_size_kb": 64, 00:14:03.204 "state": "offline", 00:14:03.204 "raid_level": "concat", 00:14:03.204 "superblock": true, 00:14:03.204 "num_base_bdevs": 3, 00:14:03.204 "num_base_bdevs_discovered": 2, 00:14:03.204 "num_base_bdevs_operational": 2, 00:14:03.204 "base_bdevs_list": [ 00:14:03.204 { 00:14:03.204 "name": null, 00:14:03.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.204 "is_configured": false, 00:14:03.204 "data_offset": 2048, 00:14:03.204 "data_size": 63488 00:14:03.204 }, 00:14:03.204 { 00:14:03.204 "name": "BaseBdev2", 00:14:03.204 "uuid": "a64df65a-029a-4e20-b0d0-c9621cd6bfae", 00:14:03.204 "is_configured": true, 00:14:03.204 "data_offset": 2048, 00:14:03.204 "data_size": 63488 00:14:03.204 }, 00:14:03.204 { 00:14:03.204 "name": "BaseBdev3", 00:14:03.204 "uuid": "b6165a75-054a-4eb2-927c-c18350eeaa45", 00:14:03.204 "is_configured": true, 00:14:03.204 "data_offset": 2048, 00:14:03.204 "data_size": 63488 00:14:03.204 } 00:14:03.204 ] 00:14:03.204 }' 00:14:03.204 10:09:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.204 10:09:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:03.774 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:03.774 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:03.774 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:03.774 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.774 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:03.774 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:03.774 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:04.034 [2024-06-10 10:09:25.751840] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:04.034 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:04.034 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:04.034 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.034 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:04.294 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:04.294 10:09:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:04.294 10:09:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:04.294 [2024-06-10 10:09:26.086495] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:04.294 [2024-06-10 10:09:26.086526] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xabc2c0 name Existed_Raid, state offline 00:14:04.294 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:04.294 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:04.294 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.294 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:04.554 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:04.554 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:04.554 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:04.554 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:04.554 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:04.554 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:04.815 BaseBdev2 00:14:04.815 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:04.815 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:14:04.815 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:04.815 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:04.815 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:04.815 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:04.815 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.815 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:05.075 [ 00:14:05.075 { 00:14:05.075 "name": "BaseBdev2", 00:14:05.075 "aliases": [ 00:14:05.075 "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc" 00:14:05.075 ], 00:14:05.075 "product_name": "Malloc disk", 00:14:05.075 "block_size": 512, 00:14:05.075 "num_blocks": 65536, 00:14:05.075 "uuid": "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc", 00:14:05.075 "assigned_rate_limits": { 00:14:05.075 "rw_ios_per_sec": 0, 00:14:05.075 "rw_mbytes_per_sec": 0, 00:14:05.075 "r_mbytes_per_sec": 0, 00:14:05.075 "w_mbytes_per_sec": 0 00:14:05.075 }, 00:14:05.075 "claimed": false, 00:14:05.075 "zoned": false, 00:14:05.075 "supported_io_types": { 00:14:05.075 "read": true, 00:14:05.075 "write": true, 
00:14:05.075 "unmap": true, 00:14:05.075 "write_zeroes": true, 00:14:05.075 "flush": true, 00:14:05.075 "reset": true, 00:14:05.075 "compare": false, 00:14:05.075 "compare_and_write": false, 00:14:05.075 "abort": true, 00:14:05.075 "nvme_admin": false, 00:14:05.075 "nvme_io": false 00:14:05.075 }, 00:14:05.075 "memory_domains": [ 00:14:05.075 { 00:14:05.075 "dma_device_id": "system", 00:14:05.075 "dma_device_type": 1 00:14:05.075 }, 00:14:05.075 { 00:14:05.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.075 "dma_device_type": 2 00:14:05.075 } 00:14:05.075 ], 00:14:05.076 "driver_specific": {} 00:14:05.076 } 00:14:05.076 ] 00:14:05.076 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:05.076 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:05.076 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:05.076 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:05.335 BaseBdev3 00:14:05.335 10:09:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:05.335 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:14:05.335 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:05.335 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:05.335 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:05.335 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:05.335 10:09:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:05.335 10:09:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:05.595 [ 00:14:05.595 { 00:14:05.595 "name": "BaseBdev3", 00:14:05.595 "aliases": [ 00:14:05.595 "289a8832-4357-4812-a4c6-8c94f69e420d" 00:14:05.595 ], 00:14:05.595 "product_name": "Malloc disk", 00:14:05.595 "block_size": 512, 00:14:05.595 "num_blocks": 65536, 00:14:05.595 "uuid": "289a8832-4357-4812-a4c6-8c94f69e420d", 00:14:05.595 "assigned_rate_limits": { 00:14:05.595 "rw_ios_per_sec": 0, 00:14:05.595 "rw_mbytes_per_sec": 0, 00:14:05.595 "r_mbytes_per_sec": 0, 00:14:05.596 "w_mbytes_per_sec": 0 00:14:05.596 }, 00:14:05.596 "claimed": false, 00:14:05.596 "zoned": false, 00:14:05.596 "supported_io_types": { 00:14:05.596 "read": true, 00:14:05.596 "write": true, 00:14:05.596 "unmap": true, 00:14:05.596 "write_zeroes": true, 00:14:05.596 "flush": true, 00:14:05.596 "reset": true, 00:14:05.596 "compare": false, 00:14:05.596 "compare_and_write": false, 00:14:05.596 "abort": true, 00:14:05.596 "nvme_admin": false, 00:14:05.596 "nvme_io": false 00:14:05.596 }, 00:14:05.596 "memory_domains": [ 00:14:05.596 { 00:14:05.596 "dma_device_id": "system", 00:14:05.596 "dma_device_type": 1 00:14:05.596 }, 00:14:05.596 { 00:14:05.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.596 "dma_device_type": 2 00:14:05.596 } 
00:14:05.596 ], 00:14:05.596 "driver_specific": {} 00:14:05.596 } 00:14:05.596 ] 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:05.596 [2024-06-10 10:09:27.409868] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:05.596 [2024-06-10 10:09:27.409899] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:05.596 [2024-06-10 10:09:27.409912] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:05.596 [2024-06-10 10:09:27.410937] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.596 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:05.856 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.856 "name": "Existed_Raid", 00:14:05.856 "uuid": "be598d47-137a-4848-b5e9-8db3e01165c7", 00:14:05.856 "strip_size_kb": 64, 00:14:05.856 "state": "configuring", 00:14:05.856 "raid_level": "concat", 00:14:05.856 "superblock": true, 00:14:05.856 "num_base_bdevs": 3, 00:14:05.856 "num_base_bdevs_discovered": 2, 00:14:05.856 "num_base_bdevs_operational": 3, 00:14:05.856 "base_bdevs_list": [ 00:14:05.856 { 00:14:05.856 "name": "BaseBdev1", 00:14:05.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:05.856 "is_configured": false, 00:14:05.856 "data_offset": 0, 00:14:05.856 "data_size": 0 00:14:05.856 }, 00:14:05.856 { 00:14:05.856 "name": "BaseBdev2", 00:14:05.856 "uuid": "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc", 
00:14:05.856 "is_configured": true, 00:14:05.856 "data_offset": 2048, 00:14:05.856 "data_size": 63488 00:14:05.856 }, 00:14:05.856 { 00:14:05.856 "name": "BaseBdev3", 00:14:05.856 "uuid": "289a8832-4357-4812-a4c6-8c94f69e420d", 00:14:05.856 "is_configured": true, 00:14:05.856 "data_offset": 2048, 00:14:05.856 "data_size": 63488 00:14:05.856 } 00:14:05.856 ] 00:14:05.856 }' 00:14:05.856 10:09:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.856 10:09:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:06.425 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:06.685 [2024-06-10 10:09:28.320148] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:06.685 "name": "Existed_Raid", 00:14:06.685 "uuid": "be598d47-137a-4848-b5e9-8db3e01165c7", 00:14:06.685 "strip_size_kb": 64, 00:14:06.685 "state": "configuring", 00:14:06.685 "raid_level": "concat", 00:14:06.685 "superblock": true, 00:14:06.685 "num_base_bdevs": 3, 00:14:06.685 "num_base_bdevs_discovered": 1, 00:14:06.685 "num_base_bdevs_operational": 3, 00:14:06.685 "base_bdevs_list": [ 00:14:06.685 { 00:14:06.685 "name": "BaseBdev1", 00:14:06.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.685 "is_configured": false, 00:14:06.685 "data_offset": 0, 00:14:06.685 "data_size": 0 00:14:06.685 }, 00:14:06.685 { 00:14:06.685 "name": null, 00:14:06.685 "uuid": "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc", 00:14:06.685 "is_configured": false, 00:14:06.685 "data_offset": 2048, 00:14:06.685 "data_size": 63488 00:14:06.685 }, 00:14:06.685 { 00:14:06.685 "name": "BaseBdev3", 00:14:06.685 "uuid": "289a8832-4357-4812-a4c6-8c94f69e420d", 00:14:06.685 "is_configured": true, 00:14:06.685 
"data_offset": 2048, 00:14:06.685 "data_size": 63488 00:14:06.685 } 00:14:06.685 ] 00:14:06.685 }' 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:06.685 10:09:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:07.255 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.255 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:07.516 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:07.516 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:07.776 [2024-06-10 10:09:29.435947] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:07.776 BaseBdev1 00:14:07.776 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:07.776 10:09:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:14:07.776 10:09:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:07.776 10:09:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:07.776 10:09:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:07.776 10:09:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:07.776 10:09:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:07.776 10:09:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:08.035 [ 00:14:08.035 { 00:14:08.035 "name": "BaseBdev1", 00:14:08.035 "aliases": [ 00:14:08.036 "1878d6ad-6a4c-476d-bc57-0c65297b4421" 00:14:08.036 ], 00:14:08.036 "product_name": "Malloc disk", 00:14:08.036 "block_size": 512, 00:14:08.036 "num_blocks": 65536, 00:14:08.036 "uuid": "1878d6ad-6a4c-476d-bc57-0c65297b4421", 00:14:08.036 "assigned_rate_limits": { 00:14:08.036 "rw_ios_per_sec": 0, 00:14:08.036 "rw_mbytes_per_sec": 0, 00:14:08.036 "r_mbytes_per_sec": 0, 00:14:08.036 "w_mbytes_per_sec": 0 00:14:08.036 }, 00:14:08.036 "claimed": true, 00:14:08.036 "claim_type": "exclusive_write", 00:14:08.036 "zoned": false, 00:14:08.036 "supported_io_types": { 00:14:08.036 "read": true, 00:14:08.036 "write": true, 00:14:08.036 "unmap": true, 00:14:08.036 "write_zeroes": true, 00:14:08.036 "flush": true, 00:14:08.036 "reset": true, 00:14:08.036 "compare": false, 00:14:08.036 "compare_and_write": false, 00:14:08.036 "abort": true, 00:14:08.036 "nvme_admin": false, 00:14:08.036 "nvme_io": false 00:14:08.036 }, 00:14:08.036 "memory_domains": [ 00:14:08.036 { 00:14:08.036 "dma_device_id": "system", 00:14:08.036 "dma_device_type": 1 00:14:08.036 }, 00:14:08.036 { 00:14:08.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.036 "dma_device_type": 2 00:14:08.036 } 00:14:08.036 ], 
00:14:08.036 "driver_specific": {} 00:14:08.036 } 00:14:08.036 ] 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.036 10:09:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:08.296 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:08.296 "name": "Existed_Raid", 00:14:08.296 "uuid": "be598d47-137a-4848-b5e9-8db3e01165c7", 00:14:08.296 "strip_size_kb": 64, 00:14:08.296 "state": "configuring", 00:14:08.296 "raid_level": "concat", 00:14:08.296 "superblock": true, 00:14:08.296 "num_base_bdevs": 3, 00:14:08.296 "num_base_bdevs_discovered": 2, 00:14:08.296 "num_base_bdevs_operational": 3, 00:14:08.296 "base_bdevs_list": [ 00:14:08.296 { 00:14:08.296 "name": "BaseBdev1", 00:14:08.296 "uuid": "1878d6ad-6a4c-476d-bc57-0c65297b4421", 00:14:08.296 "is_configured": true, 00:14:08.296 "data_offset": 2048, 00:14:08.296 "data_size": 63488 00:14:08.296 }, 00:14:08.296 { 00:14:08.296 "name": null, 00:14:08.296 "uuid": "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc", 00:14:08.296 "is_configured": false, 00:14:08.296 "data_offset": 2048, 00:14:08.296 "data_size": 63488 00:14:08.296 }, 00:14:08.296 { 00:14:08.296 "name": "BaseBdev3", 00:14:08.296 "uuid": "289a8832-4357-4812-a4c6-8c94f69e420d", 00:14:08.296 "is_configured": true, 00:14:08.296 "data_offset": 2048, 00:14:08.296 "data_size": 63488 00:14:08.296 } 00:14:08.296 ] 00:14:08.296 }' 00:14:08.296 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:08.296 10:09:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:08.866 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:08.866 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.125 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ 
true == \t\r\u\e ]] 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:09.126 [2024-06-10 10:09:30.907699] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.126 10:09:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:09.385 10:09:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.385 "name": "Existed_Raid", 00:14:09.385 "uuid": "be598d47-137a-4848-b5e9-8db3e01165c7", 00:14:09.385 "strip_size_kb": 64, 00:14:09.385 "state": "configuring", 00:14:09.385 "raid_level": "concat", 00:14:09.385 "superblock": true, 00:14:09.385 "num_base_bdevs": 3, 00:14:09.385 "num_base_bdevs_discovered": 1, 00:14:09.385 "num_base_bdevs_operational": 3, 00:14:09.385 "base_bdevs_list": [ 00:14:09.385 { 00:14:09.385 "name": "BaseBdev1", 00:14:09.385 "uuid": "1878d6ad-6a4c-476d-bc57-0c65297b4421", 00:14:09.385 "is_configured": true, 00:14:09.385 "data_offset": 2048, 00:14:09.385 "data_size": 63488 00:14:09.385 }, 00:14:09.385 { 00:14:09.385 "name": null, 00:14:09.385 "uuid": "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc", 00:14:09.385 "is_configured": false, 00:14:09.385 "data_offset": 2048, 00:14:09.385 "data_size": 63488 00:14:09.385 }, 00:14:09.385 { 00:14:09.385 "name": null, 00:14:09.385 "uuid": "289a8832-4357-4812-a4c6-8c94f69e420d", 00:14:09.385 "is_configured": false, 00:14:09.385 "data_offset": 2048, 00:14:09.385 "data_size": 63488 00:14:09.385 } 00:14:09.385 ] 00:14:09.385 }' 00:14:09.385 10:09:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.385 10:09:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:09.955 10:09:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.955 10:09:31 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:10.215 10:09:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:10.215 10:09:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:10.215 [2024-06-10 10:09:32.010513] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.215 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.476 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.476 "name": "Existed_Raid", 00:14:10.476 "uuid": "be598d47-137a-4848-b5e9-8db3e01165c7", 00:14:10.476 "strip_size_kb": 64, 00:14:10.476 "state": "configuring", 00:14:10.476 "raid_level": "concat", 00:14:10.476 "superblock": true, 00:14:10.476 "num_base_bdevs": 3, 00:14:10.476 "num_base_bdevs_discovered": 2, 00:14:10.476 "num_base_bdevs_operational": 3, 00:14:10.476 "base_bdevs_list": [ 00:14:10.476 { 00:14:10.476 "name": "BaseBdev1", 00:14:10.476 "uuid": "1878d6ad-6a4c-476d-bc57-0c65297b4421", 00:14:10.476 "is_configured": true, 00:14:10.476 "data_offset": 2048, 00:14:10.476 "data_size": 63488 00:14:10.476 }, 00:14:10.476 { 00:14:10.476 "name": null, 00:14:10.476 "uuid": "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc", 00:14:10.476 "is_configured": false, 00:14:10.476 "data_offset": 2048, 00:14:10.476 "data_size": 63488 00:14:10.476 }, 00:14:10.476 { 00:14:10.476 "name": "BaseBdev3", 00:14:10.476 "uuid": "289a8832-4357-4812-a4c6-8c94f69e420d", 00:14:10.476 "is_configured": true, 00:14:10.476 "data_offset": 2048, 00:14:10.476 "data_size": 63488 00:14:10.476 } 00:14:10.476 ] 00:14:10.476 }' 00:14:10.476 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.476 10:09:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:11.046 10:09:32 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.046 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:11.046 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:11.046 10:09:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:11.306 [2024-06-10 10:09:33.077356] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.306 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:11.567 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.567 "name": "Existed_Raid", 00:14:11.567 "uuid": "be598d47-137a-4848-b5e9-8db3e01165c7", 00:14:11.567 "strip_size_kb": 64, 00:14:11.567 "state": "configuring", 00:14:11.567 "raid_level": "concat", 00:14:11.567 "superblock": true, 00:14:11.567 "num_base_bdevs": 3, 00:14:11.567 "num_base_bdevs_discovered": 1, 00:14:11.567 "num_base_bdevs_operational": 3, 00:14:11.567 "base_bdevs_list": [ 00:14:11.567 { 00:14:11.567 "name": null, 00:14:11.567 "uuid": "1878d6ad-6a4c-476d-bc57-0c65297b4421", 00:14:11.567 "is_configured": false, 00:14:11.567 "data_offset": 2048, 00:14:11.567 "data_size": 63488 00:14:11.567 }, 00:14:11.567 { 00:14:11.567 "name": null, 00:14:11.567 "uuid": "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc", 00:14:11.567 "is_configured": false, 00:14:11.567 "data_offset": 2048, 00:14:11.567 "data_size": 63488 00:14:11.567 }, 00:14:11.567 { 00:14:11.567 "name": "BaseBdev3", 00:14:11.567 "uuid": "289a8832-4357-4812-a4c6-8c94f69e420d", 00:14:11.567 "is_configured": true, 00:14:11.567 "data_offset": 2048, 00:14:11.567 "data_size": 63488 00:14:11.567 } 00:14:11.567 ] 00:14:11.567 }' 00:14:11.567 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
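The verification pattern repeated throughout this trace (bdev_raid.sh@126 and @200) can be reproduced by hand with the same rpc.py/jq pipeline. The sketch below is illustrative only: it assumes an SPDK target is still serving RPCs on /var/tmp/spdk-raid.sock and that the array under test is named Existed_Raid, both taken from the commands captured above; the $rpc and $sock variables are just local shorthand, not part of the test script.

# Minimal sketch, not part of the captured run.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Dump the raid bdev record the same way bdev_raid.sh@126 does.
$rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

# Spot-check the fields the test asserts on: state, raid level, strip size,
# and the discovered/operational base bdev counts.
$rpc -s $sock bdev_raid_get_bdevs all \
  | jq -r '.[] | select(.name == "Existed_Raid")
           | [.state, .raid_level, (.strip_size_kb|tostring),
              (.num_base_bdevs_discovered|tostring),
              (.num_base_bdevs_operational|tostring)] | join(" ")'

Querying bdev_raid_get_bdevs all and filtering by name mirrors what verify_raid_bdev_state appears to do in the trace: the returned record carries the state ("configuring", "online", "offline") and base_bdevs_list entries whose is_configured flags the test inspects after each add/remove step.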
00:14:11.567 10:09:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:12.137 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.137 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:12.137 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:12.137 10:09:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:12.397 [2024-06-10 10:09:34.141799] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:12.397 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:12.397 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:12.397 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:12.397 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:12.397 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:12.397 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:12.397 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:12.397 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:12.397 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:12.398 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:12.398 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.398 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:12.658 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:12.658 "name": "Existed_Raid", 00:14:12.658 "uuid": "be598d47-137a-4848-b5e9-8db3e01165c7", 00:14:12.658 "strip_size_kb": 64, 00:14:12.658 "state": "configuring", 00:14:12.658 "raid_level": "concat", 00:14:12.658 "superblock": true, 00:14:12.658 "num_base_bdevs": 3, 00:14:12.658 "num_base_bdevs_discovered": 2, 00:14:12.658 "num_base_bdevs_operational": 3, 00:14:12.658 "base_bdevs_list": [ 00:14:12.658 { 00:14:12.658 "name": null, 00:14:12.658 "uuid": "1878d6ad-6a4c-476d-bc57-0c65297b4421", 00:14:12.658 "is_configured": false, 00:14:12.658 "data_offset": 2048, 00:14:12.658 "data_size": 63488 00:14:12.658 }, 00:14:12.658 { 00:14:12.658 "name": "BaseBdev2", 00:14:12.658 "uuid": "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc", 00:14:12.658 "is_configured": true, 00:14:12.658 "data_offset": 2048, 00:14:12.658 "data_size": 63488 00:14:12.658 }, 00:14:12.658 { 00:14:12.658 "name": "BaseBdev3", 00:14:12.658 "uuid": "289a8832-4357-4812-a4c6-8c94f69e420d", 00:14:12.658 "is_configured": true, 00:14:12.658 
"data_offset": 2048, 00:14:12.658 "data_size": 63488 00:14:12.658 } 00:14:12.658 ] 00:14:12.658 }' 00:14:12.658 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:12.658 10:09:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:13.228 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.228 10:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:13.228 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:13.228 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.228 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:13.488 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1878d6ad-6a4c-476d-bc57-0c65297b4421 00:14:13.748 [2024-06-10 10:09:35.450062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:13.748 [2024-06-10 10:09:35.450172] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xabad10 00:14:13.748 [2024-06-10 10:09:35.450179] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:13.748 [2024-06-10 10:09:35.450313] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7b9240 00:14:13.748 [2024-06-10 10:09:35.450404] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xabad10 00:14:13.748 [2024-06-10 10:09:35.450409] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xabad10 00:14:13.748 [2024-06-10 10:09:35.450476] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:13.748 NewBaseBdev 00:14:13.748 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:13.748 10:09:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:14:13.748 10:09:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:13.748 10:09:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:13.748 10:09:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:13.748 10:09:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:13.748 10:09:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:14.008 [ 00:14:14.008 { 00:14:14.008 "name": "NewBaseBdev", 00:14:14.008 "aliases": [ 00:14:14.008 "1878d6ad-6a4c-476d-bc57-0c65297b4421" 00:14:14.008 ], 
00:14:14.008 "product_name": "Malloc disk", 00:14:14.008 "block_size": 512, 00:14:14.008 "num_blocks": 65536, 00:14:14.008 "uuid": "1878d6ad-6a4c-476d-bc57-0c65297b4421", 00:14:14.008 "assigned_rate_limits": { 00:14:14.008 "rw_ios_per_sec": 0, 00:14:14.008 "rw_mbytes_per_sec": 0, 00:14:14.008 "r_mbytes_per_sec": 0, 00:14:14.008 "w_mbytes_per_sec": 0 00:14:14.008 }, 00:14:14.008 "claimed": true, 00:14:14.008 "claim_type": "exclusive_write", 00:14:14.008 "zoned": false, 00:14:14.008 "supported_io_types": { 00:14:14.008 "read": true, 00:14:14.008 "write": true, 00:14:14.008 "unmap": true, 00:14:14.008 "write_zeroes": true, 00:14:14.008 "flush": true, 00:14:14.008 "reset": true, 00:14:14.008 "compare": false, 00:14:14.008 "compare_and_write": false, 00:14:14.008 "abort": true, 00:14:14.008 "nvme_admin": false, 00:14:14.008 "nvme_io": false 00:14:14.008 }, 00:14:14.008 "memory_domains": [ 00:14:14.008 { 00:14:14.008 "dma_device_id": "system", 00:14:14.008 "dma_device_type": 1 00:14:14.008 }, 00:14:14.008 { 00:14:14.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.008 "dma_device_type": 2 00:14:14.008 } 00:14:14.008 ], 00:14:14.008 "driver_specific": {} 00:14:14.008 } 00:14:14.008 ] 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.008 10:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:14.268 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:14.268 "name": "Existed_Raid", 00:14:14.268 "uuid": "be598d47-137a-4848-b5e9-8db3e01165c7", 00:14:14.268 "strip_size_kb": 64, 00:14:14.268 "state": "online", 00:14:14.268 "raid_level": "concat", 00:14:14.268 "superblock": true, 00:14:14.268 "num_base_bdevs": 3, 00:14:14.268 "num_base_bdevs_discovered": 3, 00:14:14.268 "num_base_bdevs_operational": 3, 00:14:14.268 "base_bdevs_list": [ 00:14:14.268 { 00:14:14.268 "name": "NewBaseBdev", 00:14:14.268 "uuid": "1878d6ad-6a4c-476d-bc57-0c65297b4421", 00:14:14.268 "is_configured": true, 00:14:14.268 "data_offset": 2048, 00:14:14.268 "data_size": 63488 
00:14:14.268 }, 00:14:14.268 { 00:14:14.268 "name": "BaseBdev2", 00:14:14.268 "uuid": "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc", 00:14:14.268 "is_configured": true, 00:14:14.268 "data_offset": 2048, 00:14:14.268 "data_size": 63488 00:14:14.268 }, 00:14:14.268 { 00:14:14.268 "name": "BaseBdev3", 00:14:14.268 "uuid": "289a8832-4357-4812-a4c6-8c94f69e420d", 00:14:14.268 "is_configured": true, 00:14:14.268 "data_offset": 2048, 00:14:14.268 "data_size": 63488 00:14:14.268 } 00:14:14.268 ] 00:14:14.268 }' 00:14:14.268 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:14.268 10:09:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:14.838 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:14.838 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:14.838 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:14.838 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:14.838 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:14.838 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:14.838 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:14.838 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:15.098 [2024-06-10 10:09:36.749558] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:15.098 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:15.098 "name": "Existed_Raid", 00:14:15.098 "aliases": [ 00:14:15.098 "be598d47-137a-4848-b5e9-8db3e01165c7" 00:14:15.098 ], 00:14:15.098 "product_name": "Raid Volume", 00:14:15.098 "block_size": 512, 00:14:15.098 "num_blocks": 190464, 00:14:15.098 "uuid": "be598d47-137a-4848-b5e9-8db3e01165c7", 00:14:15.098 "assigned_rate_limits": { 00:14:15.098 "rw_ios_per_sec": 0, 00:14:15.098 "rw_mbytes_per_sec": 0, 00:14:15.098 "r_mbytes_per_sec": 0, 00:14:15.098 "w_mbytes_per_sec": 0 00:14:15.098 }, 00:14:15.098 "claimed": false, 00:14:15.098 "zoned": false, 00:14:15.098 "supported_io_types": { 00:14:15.098 "read": true, 00:14:15.098 "write": true, 00:14:15.098 "unmap": true, 00:14:15.098 "write_zeroes": true, 00:14:15.098 "flush": true, 00:14:15.098 "reset": true, 00:14:15.098 "compare": false, 00:14:15.098 "compare_and_write": false, 00:14:15.098 "abort": false, 00:14:15.098 "nvme_admin": false, 00:14:15.098 "nvme_io": false 00:14:15.098 }, 00:14:15.098 "memory_domains": [ 00:14:15.098 { 00:14:15.098 "dma_device_id": "system", 00:14:15.098 "dma_device_type": 1 00:14:15.098 }, 00:14:15.098 { 00:14:15.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.098 "dma_device_type": 2 00:14:15.098 }, 00:14:15.098 { 00:14:15.098 "dma_device_id": "system", 00:14:15.098 "dma_device_type": 1 00:14:15.098 }, 00:14:15.098 { 00:14:15.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.098 "dma_device_type": 2 00:14:15.098 }, 00:14:15.098 { 00:14:15.098 "dma_device_id": "system", 00:14:15.098 "dma_device_type": 1 00:14:15.098 }, 00:14:15.098 { 00:14:15.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:14:15.098 "dma_device_type": 2 00:14:15.098 } 00:14:15.098 ], 00:14:15.098 "driver_specific": { 00:14:15.098 "raid": { 00:14:15.098 "uuid": "be598d47-137a-4848-b5e9-8db3e01165c7", 00:14:15.098 "strip_size_kb": 64, 00:14:15.098 "state": "online", 00:14:15.098 "raid_level": "concat", 00:14:15.098 "superblock": true, 00:14:15.098 "num_base_bdevs": 3, 00:14:15.098 "num_base_bdevs_discovered": 3, 00:14:15.098 "num_base_bdevs_operational": 3, 00:14:15.098 "base_bdevs_list": [ 00:14:15.098 { 00:14:15.098 "name": "NewBaseBdev", 00:14:15.098 "uuid": "1878d6ad-6a4c-476d-bc57-0c65297b4421", 00:14:15.098 "is_configured": true, 00:14:15.098 "data_offset": 2048, 00:14:15.098 "data_size": 63488 00:14:15.098 }, 00:14:15.098 { 00:14:15.098 "name": "BaseBdev2", 00:14:15.098 "uuid": "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc", 00:14:15.098 "is_configured": true, 00:14:15.098 "data_offset": 2048, 00:14:15.098 "data_size": 63488 00:14:15.098 }, 00:14:15.098 { 00:14:15.098 "name": "BaseBdev3", 00:14:15.098 "uuid": "289a8832-4357-4812-a4c6-8c94f69e420d", 00:14:15.098 "is_configured": true, 00:14:15.098 "data_offset": 2048, 00:14:15.098 "data_size": 63488 00:14:15.098 } 00:14:15.098 ] 00:14:15.098 } 00:14:15.098 } 00:14:15.098 }' 00:14:15.098 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:15.098 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:15.098 BaseBdev2 00:14:15.098 BaseBdev3' 00:14:15.098 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:15.098 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:15.098 10:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:15.358 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:15.358 "name": "NewBaseBdev", 00:14:15.358 "aliases": [ 00:14:15.358 "1878d6ad-6a4c-476d-bc57-0c65297b4421" 00:14:15.358 ], 00:14:15.358 "product_name": "Malloc disk", 00:14:15.358 "block_size": 512, 00:14:15.358 "num_blocks": 65536, 00:14:15.358 "uuid": "1878d6ad-6a4c-476d-bc57-0c65297b4421", 00:14:15.358 "assigned_rate_limits": { 00:14:15.358 "rw_ios_per_sec": 0, 00:14:15.358 "rw_mbytes_per_sec": 0, 00:14:15.358 "r_mbytes_per_sec": 0, 00:14:15.358 "w_mbytes_per_sec": 0 00:14:15.358 }, 00:14:15.358 "claimed": true, 00:14:15.358 "claim_type": "exclusive_write", 00:14:15.358 "zoned": false, 00:14:15.358 "supported_io_types": { 00:14:15.358 "read": true, 00:14:15.358 "write": true, 00:14:15.358 "unmap": true, 00:14:15.358 "write_zeroes": true, 00:14:15.358 "flush": true, 00:14:15.358 "reset": true, 00:14:15.358 "compare": false, 00:14:15.358 "compare_and_write": false, 00:14:15.358 "abort": true, 00:14:15.358 "nvme_admin": false, 00:14:15.358 "nvme_io": false 00:14:15.358 }, 00:14:15.358 "memory_domains": [ 00:14:15.358 { 00:14:15.358 "dma_device_id": "system", 00:14:15.358 "dma_device_type": 1 00:14:15.358 }, 00:14:15.358 { 00:14:15.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.358 "dma_device_type": 2 00:14:15.358 } 00:14:15.358 ], 00:14:15.358 "driver_specific": {} 00:14:15.358 }' 00:14:15.358 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.358 10:09:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.358 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:15.358 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.358 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.359 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:15.359 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.359 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.618 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:15.618 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.618 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.618 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:15.618 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:15.618 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:15.618 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:15.878 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:15.878 "name": "BaseBdev2", 00:14:15.878 "aliases": [ 00:14:15.878 "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc" 00:14:15.878 ], 00:14:15.878 "product_name": "Malloc disk", 00:14:15.878 "block_size": 512, 00:14:15.878 "num_blocks": 65536, 00:14:15.878 "uuid": "a93c4f79-f1c4-4f0f-888c-cb4ac84f2ccc", 00:14:15.878 "assigned_rate_limits": { 00:14:15.878 "rw_ios_per_sec": 0, 00:14:15.878 "rw_mbytes_per_sec": 0, 00:14:15.878 "r_mbytes_per_sec": 0, 00:14:15.878 "w_mbytes_per_sec": 0 00:14:15.878 }, 00:14:15.878 "claimed": true, 00:14:15.878 "claim_type": "exclusive_write", 00:14:15.878 "zoned": false, 00:14:15.878 "supported_io_types": { 00:14:15.878 "read": true, 00:14:15.878 "write": true, 00:14:15.878 "unmap": true, 00:14:15.878 "write_zeroes": true, 00:14:15.878 "flush": true, 00:14:15.878 "reset": true, 00:14:15.878 "compare": false, 00:14:15.878 "compare_and_write": false, 00:14:15.878 "abort": true, 00:14:15.878 "nvme_admin": false, 00:14:15.878 "nvme_io": false 00:14:15.878 }, 00:14:15.878 "memory_domains": [ 00:14:15.878 { 00:14:15.878 "dma_device_id": "system", 00:14:15.878 "dma_device_type": 1 00:14:15.878 }, 00:14:15.878 { 00:14:15.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.878 "dma_device_type": 2 00:14:15.878 } 00:14:15.878 ], 00:14:15.878 "driver_specific": {} 00:14:15.878 }' 00:14:15.878 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.878 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.878 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:15.878 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.879 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.879 10:09:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:15.879 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.879 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.139 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:16.139 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.139 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.139 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:16.139 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:16.139 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:16.139 10:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:16.399 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:16.399 "name": "BaseBdev3", 00:14:16.399 "aliases": [ 00:14:16.399 "289a8832-4357-4812-a4c6-8c94f69e420d" 00:14:16.399 ], 00:14:16.399 "product_name": "Malloc disk", 00:14:16.399 "block_size": 512, 00:14:16.399 "num_blocks": 65536, 00:14:16.399 "uuid": "289a8832-4357-4812-a4c6-8c94f69e420d", 00:14:16.399 "assigned_rate_limits": { 00:14:16.399 "rw_ios_per_sec": 0, 00:14:16.399 "rw_mbytes_per_sec": 0, 00:14:16.399 "r_mbytes_per_sec": 0, 00:14:16.399 "w_mbytes_per_sec": 0 00:14:16.399 }, 00:14:16.399 "claimed": true, 00:14:16.399 "claim_type": "exclusive_write", 00:14:16.399 "zoned": false, 00:14:16.399 "supported_io_types": { 00:14:16.399 "read": true, 00:14:16.399 "write": true, 00:14:16.399 "unmap": true, 00:14:16.399 "write_zeroes": true, 00:14:16.399 "flush": true, 00:14:16.399 "reset": true, 00:14:16.399 "compare": false, 00:14:16.399 "compare_and_write": false, 00:14:16.399 "abort": true, 00:14:16.399 "nvme_admin": false, 00:14:16.399 "nvme_io": false 00:14:16.399 }, 00:14:16.399 "memory_domains": [ 00:14:16.399 { 00:14:16.399 "dma_device_id": "system", 00:14:16.399 "dma_device_type": 1 00:14:16.399 }, 00:14:16.399 { 00:14:16.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.399 "dma_device_type": 2 00:14:16.399 } 00:14:16.399 ], 00:14:16.399 "driver_specific": {} 00:14:16.399 }' 00:14:16.399 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:16.399 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:16.399 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:16.399 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:16.399 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:16.399 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:16.399 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.399 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.694 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:16.694 10:09:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.694 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.694 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:16.694 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:16.694 [2024-06-10 10:09:38.529860] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:16.694 [2024-06-10 10:09:38.529878] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:16.694 [2024-06-10 10:09:38.529913] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:16.694 [2024-06-10 10:09:38.529953] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:16.694 [2024-06-10 10:09:38.529959] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xabad10 name Existed_Raid, state offline 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 997459 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 997459 ']' 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 997459 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 997459 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 997459' 00:14:16.977 killing process with pid 997459 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 997459 00:14:16.977 [2024-06-10 10:09:38.595008] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 997459 00:14:16.977 [2024-06-10 10:09:38.609688] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:16.977 00:14:16.977 real 0m22.913s 00:14:16.977 user 0m42.989s 00:14:16.977 sys 0m3.346s 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:16.977 10:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:16.977 ************************************ 00:14:16.977 END TEST raid_state_function_test_sb 00:14:16.977 ************************************ 00:14:16.977 10:09:38 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:14:16.977 10:09:38 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:14:16.977 10:09:38 bdev_raid -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:14:16.977 10:09:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:16.977 ************************************ 00:14:16.977 START TEST raid_superblock_test 00:14:16.977 ************************************ 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 3 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1001976 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1001976 /var/tmp/spdk-raid.sock 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1001976 ']' 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:16.977 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:16.977 10:09:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.237 [2024-06-10 10:09:38.869645] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
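For readers following the trace, the raid_superblock_test setup below reduces to a short RPC sequence: three malloc bdevs wrapped by passthru bdevs with fixed UUIDs, then a concat raid built on the passthru devices with an on-disk superblock (-s). This is a minimal sketch of that sequence, not the test script itself, assuming the same rpc.py path and RPC socket used in this run:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  for i in 1 2 3; do
      # 32 MiB malloc bdev with 512-byte blocks, as in the traced bdev_malloc_create calls
      $rpc -s $sock bdev_malloc_create 32 512 -b malloc$i
      # passthru wrapper with a fixed UUID so it can be recreated later with the same identity
      $rpc -s $sock bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
  done
  # concat raid, 64k strip size, superblock written to the base bdevs (-s)
  $rpc -s $sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s
  # the array should report "online" once all three base bdevs are configured
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'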
00:14:17.237 [2024-06-10 10:09:38.869734] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1001976 ] 00:14:17.237 [2024-06-10 10:09:38.959482] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:17.237 [2024-06-10 10:09:39.023673] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:14:17.237 [2024-06-10 10:09:39.064512] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:17.237 [2024-06-10 10:09:39.064534] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:18.178 malloc1 00:14:18.178 10:09:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:18.438 [2024-06-10 10:09:40.066759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:18.438 [2024-06-10 10:09:40.066798] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:18.438 [2024-06-10 10:09:40.066810] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24b9990 00:14:18.438 [2024-06-10 10:09:40.066817] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:18.438 [2024-06-10 10:09:40.068177] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:18.438 [2024-06-10 10:09:40.068197] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:18.438 pt1 00:14:18.438 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:18.438 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:18.438 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:18.438 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:18.438 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:18.438 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:18.438 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:18.438 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:18.438 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:18.438 malloc2 00:14:18.438 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:18.697 [2024-06-10 10:09:40.465587] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:18.697 [2024-06-10 10:09:40.465618] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:18.697 [2024-06-10 10:09:40.465629] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24ba4e0 00:14:18.697 [2024-06-10 10:09:40.465636] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:18.697 [2024-06-10 10:09:40.466819] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:18.697 [2024-06-10 10:09:40.466844] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:18.697 pt2 00:14:18.697 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:18.697 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:18.697 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:18.697 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:18.697 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:18.697 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:18.697 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:18.697 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:18.697 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:18.956 malloc3 00:14:18.956 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:19.215 [2024-06-10 10:09:40.848428] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:19.215 [2024-06-10 10:09:40.848456] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:19.215 [2024-06-10 10:09:40.848464] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26664e0 00:14:19.215 [2024-06-10 10:09:40.848470] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:19.215 [2024-06-10 10:09:40.849653] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:19.215 [2024-06-10 10:09:40.849670] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:19.215 pt3 00:14:19.215 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:19.215 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:19.215 10:09:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:19.215 [2024-06-10 10:09:41.036917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:19.215 [2024-06-10 10:09:41.037920] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:19.215 [2024-06-10 10:09:41.037961] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:19.215 [2024-06-10 10:09:41.038076] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2667830 00:14:19.215 [2024-06-10 10:09:41.038083] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:19.215 [2024-06-10 10:09:41.038231] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2185dc0 00:14:19.215 [2024-06-10 10:09:41.038335] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2667830 00:14:19.215 [2024-06-10 10:09:41.038341] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2667830 00:14:19.215 [2024-06-10 10:09:41.038409] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.215 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:19.475 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.475 "name": "raid_bdev1", 00:14:19.475 "uuid": "a10fdacf-b8a9-46e1-bfe6-2431a0a0405d", 00:14:19.475 "strip_size_kb": 64, 00:14:19.475 "state": "online", 00:14:19.475 "raid_level": "concat", 00:14:19.475 "superblock": true, 00:14:19.475 "num_base_bdevs": 3, 
00:14:19.475 "num_base_bdevs_discovered": 3, 00:14:19.475 "num_base_bdevs_operational": 3, 00:14:19.475 "base_bdevs_list": [ 00:14:19.475 { 00:14:19.475 "name": "pt1", 00:14:19.475 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:19.475 "is_configured": true, 00:14:19.475 "data_offset": 2048, 00:14:19.475 "data_size": 63488 00:14:19.475 }, 00:14:19.475 { 00:14:19.475 "name": "pt2", 00:14:19.475 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:19.475 "is_configured": true, 00:14:19.475 "data_offset": 2048, 00:14:19.475 "data_size": 63488 00:14:19.475 }, 00:14:19.475 { 00:14:19.475 "name": "pt3", 00:14:19.475 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:19.475 "is_configured": true, 00:14:19.475 "data_offset": 2048, 00:14:19.475 "data_size": 63488 00:14:19.475 } 00:14:19.475 ] 00:14:19.475 }' 00:14:19.475 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.475 10:09:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:20.045 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:20.045 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:20.045 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:20.045 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:20.045 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:20.045 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:20.045 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:20.045 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:20.305 [2024-06-10 10:09:41.951410] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:20.305 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:20.305 "name": "raid_bdev1", 00:14:20.305 "aliases": [ 00:14:20.305 "a10fdacf-b8a9-46e1-bfe6-2431a0a0405d" 00:14:20.305 ], 00:14:20.305 "product_name": "Raid Volume", 00:14:20.305 "block_size": 512, 00:14:20.305 "num_blocks": 190464, 00:14:20.305 "uuid": "a10fdacf-b8a9-46e1-bfe6-2431a0a0405d", 00:14:20.305 "assigned_rate_limits": { 00:14:20.305 "rw_ios_per_sec": 0, 00:14:20.305 "rw_mbytes_per_sec": 0, 00:14:20.305 "r_mbytes_per_sec": 0, 00:14:20.305 "w_mbytes_per_sec": 0 00:14:20.305 }, 00:14:20.305 "claimed": false, 00:14:20.305 "zoned": false, 00:14:20.305 "supported_io_types": { 00:14:20.305 "read": true, 00:14:20.305 "write": true, 00:14:20.305 "unmap": true, 00:14:20.305 "write_zeroes": true, 00:14:20.305 "flush": true, 00:14:20.305 "reset": true, 00:14:20.305 "compare": false, 00:14:20.305 "compare_and_write": false, 00:14:20.305 "abort": false, 00:14:20.305 "nvme_admin": false, 00:14:20.305 "nvme_io": false 00:14:20.305 }, 00:14:20.305 "memory_domains": [ 00:14:20.305 { 00:14:20.305 "dma_device_id": "system", 00:14:20.305 "dma_device_type": 1 00:14:20.305 }, 00:14:20.305 { 00:14:20.305 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.305 "dma_device_type": 2 00:14:20.305 }, 00:14:20.305 { 00:14:20.305 "dma_device_id": "system", 00:14:20.305 "dma_device_type": 1 00:14:20.305 }, 00:14:20.305 { 00:14:20.305 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:20.305 "dma_device_type": 2 00:14:20.305 }, 00:14:20.305 { 00:14:20.305 "dma_device_id": "system", 00:14:20.305 "dma_device_type": 1 00:14:20.305 }, 00:14:20.305 { 00:14:20.305 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.305 "dma_device_type": 2 00:14:20.305 } 00:14:20.305 ], 00:14:20.305 "driver_specific": { 00:14:20.305 "raid": { 00:14:20.305 "uuid": "a10fdacf-b8a9-46e1-bfe6-2431a0a0405d", 00:14:20.305 "strip_size_kb": 64, 00:14:20.305 "state": "online", 00:14:20.305 "raid_level": "concat", 00:14:20.305 "superblock": true, 00:14:20.305 "num_base_bdevs": 3, 00:14:20.305 "num_base_bdevs_discovered": 3, 00:14:20.305 "num_base_bdevs_operational": 3, 00:14:20.305 "base_bdevs_list": [ 00:14:20.305 { 00:14:20.305 "name": "pt1", 00:14:20.305 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:20.305 "is_configured": true, 00:14:20.305 "data_offset": 2048, 00:14:20.305 "data_size": 63488 00:14:20.305 }, 00:14:20.305 { 00:14:20.305 "name": "pt2", 00:14:20.305 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:20.305 "is_configured": true, 00:14:20.305 "data_offset": 2048, 00:14:20.305 "data_size": 63488 00:14:20.305 }, 00:14:20.305 { 00:14:20.305 "name": "pt3", 00:14:20.305 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:20.305 "is_configured": true, 00:14:20.305 "data_offset": 2048, 00:14:20.305 "data_size": 63488 00:14:20.305 } 00:14:20.305 ] 00:14:20.305 } 00:14:20.305 } 00:14:20.305 }' 00:14:20.305 10:09:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:20.305 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:20.305 pt2 00:14:20.305 pt3' 00:14:20.305 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:20.305 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:20.305 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:20.564 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:20.564 "name": "pt1", 00:14:20.564 "aliases": [ 00:14:20.564 "00000000-0000-0000-0000-000000000001" 00:14:20.564 ], 00:14:20.564 "product_name": "passthru", 00:14:20.564 "block_size": 512, 00:14:20.565 "num_blocks": 65536, 00:14:20.565 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:20.565 "assigned_rate_limits": { 00:14:20.565 "rw_ios_per_sec": 0, 00:14:20.565 "rw_mbytes_per_sec": 0, 00:14:20.565 "r_mbytes_per_sec": 0, 00:14:20.565 "w_mbytes_per_sec": 0 00:14:20.565 }, 00:14:20.565 "claimed": true, 00:14:20.565 "claim_type": "exclusive_write", 00:14:20.565 "zoned": false, 00:14:20.565 "supported_io_types": { 00:14:20.565 "read": true, 00:14:20.565 "write": true, 00:14:20.565 "unmap": true, 00:14:20.565 "write_zeroes": true, 00:14:20.565 "flush": true, 00:14:20.565 "reset": true, 00:14:20.565 "compare": false, 00:14:20.565 "compare_and_write": false, 00:14:20.565 "abort": true, 00:14:20.565 "nvme_admin": false, 00:14:20.565 "nvme_io": false 00:14:20.565 }, 00:14:20.565 "memory_domains": [ 00:14:20.565 { 00:14:20.565 "dma_device_id": "system", 00:14:20.565 "dma_device_type": 1 00:14:20.565 }, 00:14:20.565 { 00:14:20.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.565 "dma_device_type": 2 00:14:20.565 } 00:14:20.565 ], 00:14:20.565 "driver_specific": { 
00:14:20.565 "passthru": { 00:14:20.565 "name": "pt1", 00:14:20.565 "base_bdev_name": "malloc1" 00:14:20.565 } 00:14:20.565 } 00:14:20.565 }' 00:14:20.565 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.565 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.565 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:20.565 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.565 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.565 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:20.565 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.824 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.824 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:20.824 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:20.824 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:20.824 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:20.824 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:20.824 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:20.824 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:21.083 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:21.083 "name": "pt2", 00:14:21.083 "aliases": [ 00:14:21.083 "00000000-0000-0000-0000-000000000002" 00:14:21.083 ], 00:14:21.083 "product_name": "passthru", 00:14:21.083 "block_size": 512, 00:14:21.083 "num_blocks": 65536, 00:14:21.083 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:21.083 "assigned_rate_limits": { 00:14:21.083 "rw_ios_per_sec": 0, 00:14:21.083 "rw_mbytes_per_sec": 0, 00:14:21.083 "r_mbytes_per_sec": 0, 00:14:21.083 "w_mbytes_per_sec": 0 00:14:21.084 }, 00:14:21.084 "claimed": true, 00:14:21.084 "claim_type": "exclusive_write", 00:14:21.084 "zoned": false, 00:14:21.084 "supported_io_types": { 00:14:21.084 "read": true, 00:14:21.084 "write": true, 00:14:21.084 "unmap": true, 00:14:21.084 "write_zeroes": true, 00:14:21.084 "flush": true, 00:14:21.084 "reset": true, 00:14:21.084 "compare": false, 00:14:21.084 "compare_and_write": false, 00:14:21.084 "abort": true, 00:14:21.084 "nvme_admin": false, 00:14:21.084 "nvme_io": false 00:14:21.084 }, 00:14:21.084 "memory_domains": [ 00:14:21.084 { 00:14:21.084 "dma_device_id": "system", 00:14:21.084 "dma_device_type": 1 00:14:21.084 }, 00:14:21.084 { 00:14:21.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.084 "dma_device_type": 2 00:14:21.084 } 00:14:21.084 ], 00:14:21.084 "driver_specific": { 00:14:21.084 "passthru": { 00:14:21.084 "name": "pt2", 00:14:21.084 "base_bdev_name": "malloc2" 00:14:21.084 } 00:14:21.084 } 00:14:21.084 }' 00:14:21.084 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.084 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.084 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 
]] 00:14:21.084 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.084 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.084 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:21.084 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.343 10:09:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.343 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:21.343 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.343 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.343 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:21.343 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:21.343 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:21.343 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:21.603 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:21.603 "name": "pt3", 00:14:21.603 "aliases": [ 00:14:21.603 "00000000-0000-0000-0000-000000000003" 00:14:21.603 ], 00:14:21.603 "product_name": "passthru", 00:14:21.603 "block_size": 512, 00:14:21.603 "num_blocks": 65536, 00:14:21.603 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:21.603 "assigned_rate_limits": { 00:14:21.603 "rw_ios_per_sec": 0, 00:14:21.603 "rw_mbytes_per_sec": 0, 00:14:21.603 "r_mbytes_per_sec": 0, 00:14:21.603 "w_mbytes_per_sec": 0 00:14:21.603 }, 00:14:21.603 "claimed": true, 00:14:21.603 "claim_type": "exclusive_write", 00:14:21.603 "zoned": false, 00:14:21.603 "supported_io_types": { 00:14:21.603 "read": true, 00:14:21.603 "write": true, 00:14:21.603 "unmap": true, 00:14:21.603 "write_zeroes": true, 00:14:21.603 "flush": true, 00:14:21.603 "reset": true, 00:14:21.603 "compare": false, 00:14:21.603 "compare_and_write": false, 00:14:21.603 "abort": true, 00:14:21.603 "nvme_admin": false, 00:14:21.603 "nvme_io": false 00:14:21.603 }, 00:14:21.603 "memory_domains": [ 00:14:21.603 { 00:14:21.603 "dma_device_id": "system", 00:14:21.603 "dma_device_type": 1 00:14:21.603 }, 00:14:21.603 { 00:14:21.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.603 "dma_device_type": 2 00:14:21.603 } 00:14:21.603 ], 00:14:21.603 "driver_specific": { 00:14:21.603 "passthru": { 00:14:21.603 "name": "pt3", 00:14:21.603 "base_bdev_name": "malloc3" 00:14:21.603 } 00:14:21.603 } 00:14:21.603 }' 00:14:21.603 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.603 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.603 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:21.603 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.603 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.603 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:21.603 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.863 10:09:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.863 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:21.863 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.863 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.863 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:21.863 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:21.863 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:22.123 [2024-06-10 10:09:43.804106] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:22.123 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a10fdacf-b8a9-46e1-bfe6-2431a0a0405d 00:14:22.123 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z a10fdacf-b8a9-46e1-bfe6-2431a0a0405d ']' 00:14:22.123 10:09:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:22.382 [2024-06-10 10:09:43.992389] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:22.382 [2024-06-10 10:09:43.992402] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:22.382 [2024-06-10 10:09:43.992436] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:22.382 [2024-06-10 10:09:43.992475] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:22.382 [2024-06-10 10:09:43.992481] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2667830 name raid_bdev1, state offline 00:14:22.382 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.382 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:22.382 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:22.382 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:22.382 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:22.382 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:22.642 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:22.642 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:22.902 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:22.902 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:22.902 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:22.902 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:23.161 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:23.161 10:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:23.162 10:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:14:23.162 10:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:23.162 10:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:23.162 10:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:14:23.162 10:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:23.162 10:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:14:23.162 10:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:23.162 10:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:14:23.162 10:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:23.162 10:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:23.162 10:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:23.422 [2024-06-10 10:09:45.115313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:23.422 [2024-06-10 10:09:45.116403] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:23.422 [2024-06-10 10:09:45.116436] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:23.422 [2024-06-10 10:09:45.116470] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:23.422 [2024-06-10 10:09:45.116496] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:23.422 [2024-06-10 10:09:45.116509] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:23.422 [2024-06-10 10:09:45.116519] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:23.422 [2024-06-10 10:09:45.116524] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2662900 name raid_bdev1, state configuring 
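The failed create traced here is the point of this step: raid_bdev1 was created with -s, so its superblock sits on the storage behind the pt* passthru bdevs, and a direct bdev_raid_create on malloc1/malloc2/malloc3 is expected to be rejected (the -17 "File exists" request/response dump follows just below). A minimal sketch of the same negative check, assuming the rpc.py path and socket used in this run:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # expected to fail: the malloc bdevs already carry raid_bdev1's superblock
  if $rpc -s $sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1; then
      echo "bdev_raid_create unexpectedly succeeded" >&2
      exit 1
  fi
  # the failed attempt must not leave a raid bdev behind
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[]'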
00:14:23.422 request: 00:14:23.422 { 00:14:23.422 "name": "raid_bdev1", 00:14:23.422 "raid_level": "concat", 00:14:23.422 "base_bdevs": [ 00:14:23.422 "malloc1", 00:14:23.422 "malloc2", 00:14:23.422 "malloc3" 00:14:23.422 ], 00:14:23.422 "superblock": false, 00:14:23.422 "strip_size_kb": 64, 00:14:23.422 "method": "bdev_raid_create", 00:14:23.422 "req_id": 1 00:14:23.422 } 00:14:23.422 Got JSON-RPC error response 00:14:23.422 response: 00:14:23.422 { 00:14:23.422 "code": -17, 00:14:23.422 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:23.422 } 00:14:23.422 10:09:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:14:23.422 10:09:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:14:23.422 10:09:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:14:23.422 10:09:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:14:23.422 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.422 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:23.683 [2024-06-10 10:09:45.496230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:23.683 [2024-06-10 10:09:45.496256] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:23.683 [2024-06-10 10:09:45.496266] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2668f50 00:14:23.683 [2024-06-10 10:09:45.496272] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:23.683 [2024-06-10 10:09:45.497517] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:23.683 [2024-06-10 10:09:45.497537] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:23.683 [2024-06-10 10:09:45.497579] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:23.683 [2024-06-10 10:09:45.497596] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:23.683 pt1 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:23.683 10:09:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.683 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:23.943 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:23.943 "name": "raid_bdev1", 00:14:23.943 "uuid": "a10fdacf-b8a9-46e1-bfe6-2431a0a0405d", 00:14:23.943 "strip_size_kb": 64, 00:14:23.943 "state": "configuring", 00:14:23.943 "raid_level": "concat", 00:14:23.943 "superblock": true, 00:14:23.943 "num_base_bdevs": 3, 00:14:23.943 "num_base_bdevs_discovered": 1, 00:14:23.943 "num_base_bdevs_operational": 3, 00:14:23.943 "base_bdevs_list": [ 00:14:23.943 { 00:14:23.943 "name": "pt1", 00:14:23.943 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:23.943 "is_configured": true, 00:14:23.943 "data_offset": 2048, 00:14:23.943 "data_size": 63488 00:14:23.943 }, 00:14:23.943 { 00:14:23.943 "name": null, 00:14:23.943 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:23.943 "is_configured": false, 00:14:23.943 "data_offset": 2048, 00:14:23.943 "data_size": 63488 00:14:23.943 }, 00:14:23.943 { 00:14:23.943 "name": null, 00:14:23.943 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:23.943 "is_configured": false, 00:14:23.943 "data_offset": 2048, 00:14:23.943 "data_size": 63488 00:14:23.943 } 00:14:23.943 ] 00:14:23.943 }' 00:14:23.943 10:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:23.943 10:09:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:24.513 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:24.513 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:24.773 [2024-06-10 10:09:46.430606] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:24.773 [2024-06-10 10:09:46.430632] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:24.773 [2024-06-10 10:09:46.430644] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2663810 00:14:24.773 [2024-06-10 10:09:46.430651] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:24.773 [2024-06-10 10:09:46.430913] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:24.773 [2024-06-10 10:09:46.430923] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:24.773 [2024-06-10 10:09:46.430968] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:24.773 [2024-06-10 10:09:46.430980] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:24.773 pt2 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:24.773 
[2024-06-10 10:09:46.619087] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.773 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:25.034 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.034 "name": "raid_bdev1", 00:14:25.034 "uuid": "a10fdacf-b8a9-46e1-bfe6-2431a0a0405d", 00:14:25.034 "strip_size_kb": 64, 00:14:25.034 "state": "configuring", 00:14:25.034 "raid_level": "concat", 00:14:25.034 "superblock": true, 00:14:25.034 "num_base_bdevs": 3, 00:14:25.034 "num_base_bdevs_discovered": 1, 00:14:25.034 "num_base_bdevs_operational": 3, 00:14:25.034 "base_bdevs_list": [ 00:14:25.034 { 00:14:25.034 "name": "pt1", 00:14:25.034 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:25.034 "is_configured": true, 00:14:25.034 "data_offset": 2048, 00:14:25.034 "data_size": 63488 00:14:25.034 }, 00:14:25.034 { 00:14:25.034 "name": null, 00:14:25.034 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:25.034 "is_configured": false, 00:14:25.034 "data_offset": 2048, 00:14:25.034 "data_size": 63488 00:14:25.034 }, 00:14:25.034 { 00:14:25.034 "name": null, 00:14:25.034 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:25.034 "is_configured": false, 00:14:25.034 "data_offset": 2048, 00:14:25.034 "data_size": 63488 00:14:25.034 } 00:14:25.034 ] 00:14:25.034 }' 00:14:25.034 10:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.034 10:09:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.604 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:25.604 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:25.604 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:25.865 [2024-06-10 10:09:47.501299] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:25.865 [2024-06-10 10:09:47.501326] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:25.865 [2024-06-10 10:09:47.501336] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26680b0 00:14:25.865 [2024-06-10 10:09:47.501342] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:25.865 [2024-06-10 10:09:47.501599] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:25.865 [2024-06-10 10:09:47.501608] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:25.865 [2024-06-10 10:09:47.501646] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:25.865 [2024-06-10 10:09:47.501662] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:25.865 pt2 00:14:25.865 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:25.865 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:25.865 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:25.865 [2024-06-10 10:09:47.661702] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:25.865 [2024-06-10 10:09:47.661719] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:25.865 [2024-06-10 10:09:47.661726] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2664330 00:14:25.866 [2024-06-10 10:09:47.661732] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:25.866 [2024-06-10 10:09:47.661945] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:25.866 [2024-06-10 10:09:47.661956] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:25.866 [2024-06-10 10:09:47.661986] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:25.866 [2024-06-10 10:09:47.661996] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:25.866 [2024-06-10 10:09:47.662071] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2663b30 00:14:25.866 [2024-06-10 10:09:47.662076] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:25.866 [2024-06-10 10:09:47.662206] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24badf0 00:14:25.866 [2024-06-10 10:09:47.662300] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2663b30 00:14:25.866 [2024-06-10 10:09:47.662305] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2663b30 00:14:25.866 [2024-06-10 10:09:47.662374] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:25.866 pt3 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.866 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:26.127 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:26.127 "name": "raid_bdev1", 00:14:26.127 "uuid": "a10fdacf-b8a9-46e1-bfe6-2431a0a0405d", 00:14:26.127 "strip_size_kb": 64, 00:14:26.127 "state": "online", 00:14:26.127 "raid_level": "concat", 00:14:26.127 "superblock": true, 00:14:26.127 "num_base_bdevs": 3, 00:14:26.127 "num_base_bdevs_discovered": 3, 00:14:26.127 "num_base_bdevs_operational": 3, 00:14:26.127 "base_bdevs_list": [ 00:14:26.127 { 00:14:26.127 "name": "pt1", 00:14:26.127 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:26.127 "is_configured": true, 00:14:26.127 "data_offset": 2048, 00:14:26.127 "data_size": 63488 00:14:26.127 }, 00:14:26.127 { 00:14:26.127 "name": "pt2", 00:14:26.127 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:26.127 "is_configured": true, 00:14:26.127 "data_offset": 2048, 00:14:26.127 "data_size": 63488 00:14:26.127 }, 00:14:26.127 { 00:14:26.127 "name": "pt3", 00:14:26.127 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:26.127 "is_configured": true, 00:14:26.127 "data_offset": 2048, 00:14:26.127 "data_size": 63488 00:14:26.127 } 00:14:26.127 ] 00:14:26.127 }' 00:14:26.127 10:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:26.127 10:09:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.699 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:26.699 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:26.699 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:26.699 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:26.699 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:26.699 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:26.699 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:26.699 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:26.961 [2024-06-10 10:09:48.572215] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:26.961 10:09:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:26.961 "name": "raid_bdev1", 00:14:26.961 "aliases": [ 00:14:26.961 "a10fdacf-b8a9-46e1-bfe6-2431a0a0405d" 00:14:26.961 ], 00:14:26.961 "product_name": "Raid Volume", 00:14:26.961 "block_size": 512, 00:14:26.961 "num_blocks": 190464, 00:14:26.961 "uuid": "a10fdacf-b8a9-46e1-bfe6-2431a0a0405d", 00:14:26.961 "assigned_rate_limits": { 00:14:26.961 "rw_ios_per_sec": 0, 00:14:26.961 "rw_mbytes_per_sec": 0, 00:14:26.961 "r_mbytes_per_sec": 0, 00:14:26.961 "w_mbytes_per_sec": 0 00:14:26.961 }, 00:14:26.961 "claimed": false, 00:14:26.961 "zoned": false, 00:14:26.961 "supported_io_types": { 00:14:26.961 "read": true, 00:14:26.961 "write": true, 00:14:26.961 "unmap": true, 00:14:26.961 "write_zeroes": true, 00:14:26.961 "flush": true, 00:14:26.961 "reset": true, 00:14:26.961 "compare": false, 00:14:26.961 "compare_and_write": false, 00:14:26.961 "abort": false, 00:14:26.961 "nvme_admin": false, 00:14:26.961 "nvme_io": false 00:14:26.961 }, 00:14:26.961 "memory_domains": [ 00:14:26.961 { 00:14:26.961 "dma_device_id": "system", 00:14:26.961 "dma_device_type": 1 00:14:26.961 }, 00:14:26.961 { 00:14:26.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.961 "dma_device_type": 2 00:14:26.961 }, 00:14:26.961 { 00:14:26.961 "dma_device_id": "system", 00:14:26.961 "dma_device_type": 1 00:14:26.961 }, 00:14:26.961 { 00:14:26.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.961 "dma_device_type": 2 00:14:26.961 }, 00:14:26.961 { 00:14:26.961 "dma_device_id": "system", 00:14:26.961 "dma_device_type": 1 00:14:26.961 }, 00:14:26.961 { 00:14:26.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.961 "dma_device_type": 2 00:14:26.961 } 00:14:26.961 ], 00:14:26.961 "driver_specific": { 00:14:26.961 "raid": { 00:14:26.961 "uuid": "a10fdacf-b8a9-46e1-bfe6-2431a0a0405d", 00:14:26.961 "strip_size_kb": 64, 00:14:26.961 "state": "online", 00:14:26.961 "raid_level": "concat", 00:14:26.961 "superblock": true, 00:14:26.961 "num_base_bdevs": 3, 00:14:26.961 "num_base_bdevs_discovered": 3, 00:14:26.961 "num_base_bdevs_operational": 3, 00:14:26.961 "base_bdevs_list": [ 00:14:26.961 { 00:14:26.961 "name": "pt1", 00:14:26.961 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:26.961 "is_configured": true, 00:14:26.961 "data_offset": 2048, 00:14:26.961 "data_size": 63488 00:14:26.961 }, 00:14:26.961 { 00:14:26.961 "name": "pt2", 00:14:26.961 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:26.961 "is_configured": true, 00:14:26.961 "data_offset": 2048, 00:14:26.961 "data_size": 63488 00:14:26.961 }, 00:14:26.961 { 00:14:26.961 "name": "pt3", 00:14:26.961 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:26.961 "is_configured": true, 00:14:26.961 "data_offset": 2048, 00:14:26.961 "data_size": 63488 00:14:26.961 } 00:14:26.961 ] 00:14:26.961 } 00:14:26.961 } 00:14:26.961 }' 00:14:26.961 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:26.961 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:26.961 pt2 00:14:26.961 pt3' 00:14:26.961 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:26.961 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:26.961 10:09:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:27.222 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:27.222 "name": "pt1", 00:14:27.222 "aliases": [ 00:14:27.222 "00000000-0000-0000-0000-000000000001" 00:14:27.222 ], 00:14:27.222 "product_name": "passthru", 00:14:27.222 "block_size": 512, 00:14:27.222 "num_blocks": 65536, 00:14:27.222 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:27.222 "assigned_rate_limits": { 00:14:27.222 "rw_ios_per_sec": 0, 00:14:27.222 "rw_mbytes_per_sec": 0, 00:14:27.222 "r_mbytes_per_sec": 0, 00:14:27.222 "w_mbytes_per_sec": 0 00:14:27.222 }, 00:14:27.222 "claimed": true, 00:14:27.222 "claim_type": "exclusive_write", 00:14:27.222 "zoned": false, 00:14:27.222 "supported_io_types": { 00:14:27.222 "read": true, 00:14:27.222 "write": true, 00:14:27.222 "unmap": true, 00:14:27.222 "write_zeroes": true, 00:14:27.222 "flush": true, 00:14:27.222 "reset": true, 00:14:27.222 "compare": false, 00:14:27.222 "compare_and_write": false, 00:14:27.222 "abort": true, 00:14:27.222 "nvme_admin": false, 00:14:27.222 "nvme_io": false 00:14:27.222 }, 00:14:27.222 "memory_domains": [ 00:14:27.222 { 00:14:27.222 "dma_device_id": "system", 00:14:27.222 "dma_device_type": 1 00:14:27.222 }, 00:14:27.222 { 00:14:27.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.222 "dma_device_type": 2 00:14:27.222 } 00:14:27.222 ], 00:14:27.222 "driver_specific": { 00:14:27.222 "passthru": { 00:14:27.223 "name": "pt1", 00:14:27.223 "base_bdev_name": "malloc1" 00:14:27.223 } 00:14:27.223 } 00:14:27.223 }' 00:14:27.223 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.223 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.223 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:27.223 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.223 10:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.223 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:27.223 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.223 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.484 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:27.484 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.484 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.484 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:27.484 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:27.484 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:27.484 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:27.745 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:27.745 "name": "pt2", 00:14:27.745 "aliases": [ 00:14:27.745 "00000000-0000-0000-0000-000000000002" 00:14:27.745 ], 00:14:27.745 "product_name": "passthru", 00:14:27.745 "block_size": 512, 00:14:27.745 "num_blocks": 65536, 00:14:27.745 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:14:27.745 "assigned_rate_limits": { 00:14:27.745 "rw_ios_per_sec": 0, 00:14:27.745 "rw_mbytes_per_sec": 0, 00:14:27.745 "r_mbytes_per_sec": 0, 00:14:27.745 "w_mbytes_per_sec": 0 00:14:27.745 }, 00:14:27.745 "claimed": true, 00:14:27.745 "claim_type": "exclusive_write", 00:14:27.745 "zoned": false, 00:14:27.745 "supported_io_types": { 00:14:27.745 "read": true, 00:14:27.745 "write": true, 00:14:27.745 "unmap": true, 00:14:27.745 "write_zeroes": true, 00:14:27.745 "flush": true, 00:14:27.745 "reset": true, 00:14:27.745 "compare": false, 00:14:27.745 "compare_and_write": false, 00:14:27.745 "abort": true, 00:14:27.745 "nvme_admin": false, 00:14:27.745 "nvme_io": false 00:14:27.745 }, 00:14:27.745 "memory_domains": [ 00:14:27.745 { 00:14:27.745 "dma_device_id": "system", 00:14:27.745 "dma_device_type": 1 00:14:27.745 }, 00:14:27.745 { 00:14:27.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.745 "dma_device_type": 2 00:14:27.745 } 00:14:27.745 ], 00:14:27.745 "driver_specific": { 00:14:27.745 "passthru": { 00:14:27.745 "name": "pt2", 00:14:27.745 "base_bdev_name": "malloc2" 00:14:27.745 } 00:14:27.745 } 00:14:27.745 }' 00:14:27.745 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.745 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.745 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:27.745 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.745 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.745 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:27.745 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.745 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:28.006 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:28.006 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.007 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.007 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:28.007 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:28.007 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:28.007 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:28.268 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:28.268 "name": "pt3", 00:14:28.268 "aliases": [ 00:14:28.268 "00000000-0000-0000-0000-000000000003" 00:14:28.268 ], 00:14:28.268 "product_name": "passthru", 00:14:28.268 "block_size": 512, 00:14:28.268 "num_blocks": 65536, 00:14:28.268 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:28.268 "assigned_rate_limits": { 00:14:28.268 "rw_ios_per_sec": 0, 00:14:28.268 "rw_mbytes_per_sec": 0, 00:14:28.268 "r_mbytes_per_sec": 0, 00:14:28.268 "w_mbytes_per_sec": 0 00:14:28.268 }, 00:14:28.268 "claimed": true, 00:14:28.268 "claim_type": "exclusive_write", 00:14:28.268 "zoned": false, 00:14:28.268 "supported_io_types": { 00:14:28.268 "read": true, 00:14:28.268 "write": true, 
00:14:28.268 "unmap": true, 00:14:28.268 "write_zeroes": true, 00:14:28.268 "flush": true, 00:14:28.268 "reset": true, 00:14:28.268 "compare": false, 00:14:28.268 "compare_and_write": false, 00:14:28.268 "abort": true, 00:14:28.268 "nvme_admin": false, 00:14:28.268 "nvme_io": false 00:14:28.268 }, 00:14:28.268 "memory_domains": [ 00:14:28.268 { 00:14:28.268 "dma_device_id": "system", 00:14:28.268 "dma_device_type": 1 00:14:28.268 }, 00:14:28.268 { 00:14:28.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.268 "dma_device_type": 2 00:14:28.268 } 00:14:28.268 ], 00:14:28.268 "driver_specific": { 00:14:28.268 "passthru": { 00:14:28.268 "name": "pt3", 00:14:28.268 "base_bdev_name": "malloc3" 00:14:28.268 } 00:14:28.268 } 00:14:28.268 }' 00:14:28.268 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:28.268 10:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:28.268 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:28.268 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:28.268 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:28.268 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:28.268 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:28.268 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:28.529 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:28.529 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.529 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.529 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:28.529 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:28.529 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:28.790 [2024-06-10 10:09:50.404872] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' a10fdacf-b8a9-46e1-bfe6-2431a0a0405d '!=' a10fdacf-b8a9-46e1-bfe6-2431a0a0405d ']' 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1001976 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1001976 ']' 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1001976 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1001976 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@955 -- # process_name=reactor_0 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1001976' 00:14:28.790 killing process with pid 1001976 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1001976 00:14:28.790 [2024-06-10 10:09:50.472796] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:28.790 [2024-06-10 10:09:50.472844] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:28.790 [2024-06-10 10:09:50.472890] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:28.790 [2024-06-10 10:09:50.472896] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2663b30 name raid_bdev1, state offline 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1001976 00:14:28.790 [2024-06-10 10:09:50.487687] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:28.790 00:14:28.790 real 0m11.798s 00:14:28.790 user 0m21.739s 00:14:28.790 sys 0m1.733s 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:28.790 10:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:28.790 ************************************ 00:14:28.790 END TEST raid_superblock_test 00:14:28.790 ************************************ 00:14:28.790 10:09:50 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:14:28.790 10:09:50 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:14:28.790 10:09:50 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:28.790 10:09:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:29.052 ************************************ 00:14:29.052 START TEST raid_read_error_test 00:14:29.052 ************************************ 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 3 read 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.csgbbQU7l9 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1004387 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1004387 /var/tmp/spdk-raid.sock 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1004387 ']' 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:29.052 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:29.052 10:09:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.052 [2024-06-10 10:09:50.749355] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:14:29.052 [2024-06-10 10:09:50.749400] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1004387 ] 00:14:29.052 [2024-06-10 10:09:50.835031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:29.052 [2024-06-10 10:09:50.899611] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:14:29.313 [2024-06-10 10:09:50.940007] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:29.313 [2024-06-10 10:09:50.940030] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:29.883 10:09:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:29.883 10:09:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:14:29.883 10:09:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:29.883 10:09:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:29.883 BaseBdev1_malloc 00:14:29.883 10:09:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:30.145 true 00:14:30.145 10:09:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:30.145 [2024-06-10 10:09:51.977899] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:30.145 [2024-06-10 10:09:51.977933] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:30.145 [2024-06-10 10:09:51.977943] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c7ad10 00:14:30.145 [2024-06-10 10:09:51.977949] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:30.145 [2024-06-10 10:09:51.979273] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:30.145 [2024-06-10 10:09:51.979292] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:30.145 BaseBdev1 00:14:30.145 10:09:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:30.145 10:09:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:30.408 BaseBdev2_malloc 00:14:30.409 10:09:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:30.671 true 00:14:30.671 10:09:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:30.671 [2024-06-10 10:09:52.424871] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:30.671 [2024-06-10 10:09:52.424898] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:14:30.671 [2024-06-10 10:09:52.424908] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c7f710 00:14:30.671 [2024-06-10 10:09:52.424914] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:30.671 [2024-06-10 10:09:52.426063] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:30.671 [2024-06-10 10:09:52.426081] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:30.671 BaseBdev2 00:14:30.671 10:09:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:30.671 10:09:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:30.932 BaseBdev3_malloc 00:14:30.932 10:09:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:30.932 true 00:14:30.932 10:09:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:31.192 [2024-06-10 10:09:52.919806] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:31.192 [2024-06-10 10:09:52.919834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:31.192 [2024-06-10 10:09:52.919843] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c80340 00:14:31.192 [2024-06-10 10:09:52.919849] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:31.192 [2024-06-10 10:09:52.920980] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:31.192 [2024-06-10 10:09:52.920997] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:31.192 BaseBdev3 00:14:31.193 10:09:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:31.453 [2024-06-10 10:09:53.064197] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:31.453 [2024-06-10 10:09:53.065166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:31.453 [2024-06-10 10:09:53.065218] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:31.453 [2024-06-10 10:09:53.065373] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c83160 00:14:31.453 [2024-06-10 10:09:53.065380] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:31.453 [2024-06-10 10:09:53.065511] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c7c720 00:14:31.453 [2024-06-10 10:09:53.065624] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c83160 00:14:31.453 [2024-06-10 10:09:53.065633] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c83160 00:14:31.453 [2024-06-10 10:09:53.065706] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:31.453 10:09:53 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:31.453 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:31.453 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.454 "name": "raid_bdev1", 00:14:31.454 "uuid": "8561a6f4-fc60-496e-b023-372a9e983fce", 00:14:31.454 "strip_size_kb": 64, 00:14:31.454 "state": "online", 00:14:31.454 "raid_level": "concat", 00:14:31.454 "superblock": true, 00:14:31.454 "num_base_bdevs": 3, 00:14:31.454 "num_base_bdevs_discovered": 3, 00:14:31.454 "num_base_bdevs_operational": 3, 00:14:31.454 "base_bdevs_list": [ 00:14:31.454 { 00:14:31.454 "name": "BaseBdev1", 00:14:31.454 "uuid": "e1e4f29d-2cc0-5958-bf33-9e99edbe9796", 00:14:31.454 "is_configured": true, 00:14:31.454 "data_offset": 2048, 00:14:31.454 "data_size": 63488 00:14:31.454 }, 00:14:31.454 { 00:14:31.454 "name": "BaseBdev2", 00:14:31.454 "uuid": "cb7e227a-1161-5376-a9a1-9354b882079d", 00:14:31.454 "is_configured": true, 00:14:31.454 "data_offset": 2048, 00:14:31.454 "data_size": 63488 00:14:31.454 }, 00:14:31.454 { 00:14:31.454 "name": "BaseBdev3", 00:14:31.454 "uuid": "28ef203a-5121-5d74-badb-2465e88183bf", 00:14:31.454 "is_configured": true, 00:14:31.454 "data_offset": 2048, 00:14:31.454 "data_size": 63488 00:14:31.454 } 00:14:31.454 ] 00:14:31.454 }' 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.454 10:09:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:32.025 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:32.025 10:09:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:32.025 [2024-06-10 10:09:53.858385] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17d7d40 00:14:32.968 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:33.279 
10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.279 10:09:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:33.540 10:09:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:33.540 "name": "raid_bdev1", 00:14:33.540 "uuid": "8561a6f4-fc60-496e-b023-372a9e983fce", 00:14:33.540 "strip_size_kb": 64, 00:14:33.540 "state": "online", 00:14:33.540 "raid_level": "concat", 00:14:33.540 "superblock": true, 00:14:33.540 "num_base_bdevs": 3, 00:14:33.540 "num_base_bdevs_discovered": 3, 00:14:33.540 "num_base_bdevs_operational": 3, 00:14:33.540 "base_bdevs_list": [ 00:14:33.540 { 00:14:33.540 "name": "BaseBdev1", 00:14:33.540 "uuid": "e1e4f29d-2cc0-5958-bf33-9e99edbe9796", 00:14:33.540 "is_configured": true, 00:14:33.540 "data_offset": 2048, 00:14:33.540 "data_size": 63488 00:14:33.540 }, 00:14:33.540 { 00:14:33.540 "name": "BaseBdev2", 00:14:33.540 "uuid": "cb7e227a-1161-5376-a9a1-9354b882079d", 00:14:33.540 "is_configured": true, 00:14:33.540 "data_offset": 2048, 00:14:33.540 "data_size": 63488 00:14:33.540 }, 00:14:33.540 { 00:14:33.540 "name": "BaseBdev3", 00:14:33.540 "uuid": "28ef203a-5121-5d74-badb-2465e88183bf", 00:14:33.540 "is_configured": true, 00:14:33.540 "data_offset": 2048, 00:14:33.540 "data_size": 63488 00:14:33.540 } 00:14:33.540 ] 00:14:33.540 }' 00:14:33.540 10:09:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:33.540 10:09:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.801 10:09:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:34.062 [2024-06-10 10:09:55.817575] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:34.062 [2024-06-10 10:09:55.817604] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:34.062 [2024-06-10 10:09:55.820194] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:34.062 [2024-06-10 10:09:55.820220] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:34.062 [2024-06-10 10:09:55.820246] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:34.062 [2024-06-10 10:09:55.820252] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c83160 name raid_bdev1, state offline 00:14:34.062 0 00:14:34.062 10:09:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1004387 00:14:34.062 10:09:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1004387 ']' 00:14:34.062 10:09:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1004387 00:14:34.062 10:09:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:14:34.062 10:09:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:34.062 10:09:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1004387 00:14:34.062 10:09:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:14:34.062 10:09:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:34.062 10:09:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1004387' 00:14:34.062 killing process with pid 1004387 00:14:34.062 10:09:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1004387 00:14:34.062 [2024-06-10 10:09:55.887813] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:34.062 10:09:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1004387 00:14:34.062 [2024-06-10 10:09:55.898990] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:34.323 10:09:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.csgbbQU7l9 00:14:34.323 10:09:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:34.323 10:09:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:34.323 10:09:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.51 00:14:34.323 10:09:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:14:34.323 10:09:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:34.323 10:09:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:34.323 10:09:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.51 != \0\.\0\0 ]] 00:14:34.323 00:14:34.323 real 0m5.352s 00:14:34.323 user 0m8.409s 00:14:34.323 sys 0m0.734s 00:14:34.323 10:09:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:34.323 10:09:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:34.323 ************************************ 00:14:34.323 END TEST raid_read_error_test 00:14:34.323 ************************************ 00:14:34.323 10:09:56 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:14:34.323 10:09:56 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:14:34.323 10:09:56 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:34.323 10:09:56 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:14:34.323 ************************************ 00:14:34.323 START TEST raid_write_error_test 00:14:34.323 ************************************ 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 3 write 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:34.323 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.VRgE5Ct67L 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1005343 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1005343 /var/tmp/spdk-raid.sock 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:34.324 10:09:56 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1005343 ']' 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:34.324 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:34.324 10:09:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:34.324 [2024-06-10 10:09:56.169808] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:14:34.324 [2024-06-10 10:09:56.169858] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1005343 ] 00:14:34.584 [2024-06-10 10:09:56.255667] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:34.584 [2024-06-10 10:09:56.317978] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:14:34.584 [2024-06-10 10:09:56.362174] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:34.584 [2024-06-10 10:09:56.362199] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:35.157 10:09:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:35.157 10:09:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:14:35.157 10:09:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:35.157 10:09:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:35.417 BaseBdev1_malloc 00:14:35.417 10:09:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:35.678 true 00:14:35.678 10:09:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:35.978 [2024-06-10 10:09:57.544549] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:35.978 [2024-06-10 10:09:57.544580] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:35.978 [2024-06-10 10:09:57.544591] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x169bd10 00:14:35.978 [2024-06-10 10:09:57.544598] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:35.978 [2024-06-10 10:09:57.545916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:35.978 [2024-06-10 10:09:57.545935] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:35.978 BaseBdev1 00:14:35.978 10:09:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in 
"${base_bdevs[@]}" 00:14:35.978 10:09:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:35.978 BaseBdev2_malloc 00:14:35.978 10:09:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:36.256 true 00:14:36.256 10:09:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:36.256 [2024-06-10 10:09:58.111856] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:36.256 [2024-06-10 10:09:58.111884] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:36.256 [2024-06-10 10:09:58.111895] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16a0710 00:14:36.256 [2024-06-10 10:09:58.111901] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:36.256 [2024-06-10 10:09:58.113048] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:36.256 [2024-06-10 10:09:58.113067] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:36.256 BaseBdev2 00:14:36.517 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:36.517 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:36.517 BaseBdev3_malloc 00:14:36.517 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:36.778 true 00:14:36.778 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:37.039 [2024-06-10 10:09:58.662868] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:37.039 [2024-06-10 10:09:58.662892] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:37.039 [2024-06-10 10:09:58.662901] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16a1340 00:14:37.039 [2024-06-10 10:09:58.662907] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:37.039 [2024-06-10 10:09:58.664040] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:37.039 [2024-06-10 10:09:58.664058] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:37.039 BaseBdev3 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:37.039 [2024-06-10 10:09:58.843349] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:37.039 [2024-06-10 10:09:58.844319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev2 is claimed 00:14:37.039 [2024-06-10 10:09:58.844372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:37.039 [2024-06-10 10:09:58.844528] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16a4160 00:14:37.039 [2024-06-10 10:09:58.844536] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:37.039 [2024-06-10 10:09:58.844669] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x169d720 00:14:37.039 [2024-06-10 10:09:58.844782] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16a4160 00:14:37.039 [2024-06-10 10:09:58.844787] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16a4160 00:14:37.039 [2024-06-10 10:09:58.844866] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.039 10:09:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:37.301 10:09:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.301 "name": "raid_bdev1", 00:14:37.301 "uuid": "774eaf1b-a2ed-4a06-8c78-8de25164fa0f", 00:14:37.301 "strip_size_kb": 64, 00:14:37.301 "state": "online", 00:14:37.301 "raid_level": "concat", 00:14:37.301 "superblock": true, 00:14:37.301 "num_base_bdevs": 3, 00:14:37.301 "num_base_bdevs_discovered": 3, 00:14:37.301 "num_base_bdevs_operational": 3, 00:14:37.301 "base_bdevs_list": [ 00:14:37.301 { 00:14:37.301 "name": "BaseBdev1", 00:14:37.301 "uuid": "5b8089c7-2c5b-53b6-bf91-3f0bf40cf96c", 00:14:37.301 "is_configured": true, 00:14:37.301 "data_offset": 2048, 00:14:37.301 "data_size": 63488 00:14:37.301 }, 00:14:37.301 { 00:14:37.301 "name": "BaseBdev2", 00:14:37.301 "uuid": "f4e21b1b-7d39-5cfc-97c1-565eebdf3e95", 00:14:37.301 "is_configured": true, 00:14:37.301 "data_offset": 2048, 00:14:37.301 "data_size": 63488 00:14:37.301 }, 00:14:37.301 { 00:14:37.301 "name": "BaseBdev3", 00:14:37.301 "uuid": "33746892-70ff-57cf-a660-9ae68998bbcd", 00:14:37.301 "is_configured": true, 00:14:37.301 "data_offset": 2048, 00:14:37.301 "data_size": 63488 00:14:37.301 } 00:14:37.301 ] 00:14:37.301 
}' 00:14:37.301 10:09:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.301 10:09:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.872 10:09:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:37.872 10:09:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:37.872 [2024-06-10 10:09:59.689674] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11f8d40 00:14:38.814 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:39.075 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.076 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:39.337 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.337 "name": "raid_bdev1", 00:14:39.337 "uuid": "774eaf1b-a2ed-4a06-8c78-8de25164fa0f", 00:14:39.337 "strip_size_kb": 64, 00:14:39.337 "state": "online", 00:14:39.337 "raid_level": "concat", 00:14:39.337 "superblock": true, 00:14:39.337 "num_base_bdevs": 3, 00:14:39.337 "num_base_bdevs_discovered": 3, 00:14:39.337 "num_base_bdevs_operational": 3, 00:14:39.337 "base_bdevs_list": [ 00:14:39.337 { 00:14:39.337 "name": "BaseBdev1", 00:14:39.337 "uuid": "5b8089c7-2c5b-53b6-bf91-3f0bf40cf96c", 00:14:39.337 "is_configured": true, 00:14:39.337 "data_offset": 2048, 00:14:39.337 "data_size": 63488 00:14:39.337 }, 00:14:39.337 { 00:14:39.337 "name": "BaseBdev2", 00:14:39.337 "uuid": "f4e21b1b-7d39-5cfc-97c1-565eebdf3e95", 00:14:39.337 "is_configured": true, 00:14:39.337 "data_offset": 2048, 00:14:39.337 "data_size": 
63488 00:14:39.337 }, 00:14:39.337 { 00:14:39.337 "name": "BaseBdev3", 00:14:39.337 "uuid": "33746892-70ff-57cf-a660-9ae68998bbcd", 00:14:39.337 "is_configured": true, 00:14:39.337 "data_offset": 2048, 00:14:39.337 "data_size": 63488 00:14:39.337 } 00:14:39.337 ] 00:14:39.337 }' 00:14:39.337 10:10:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.337 10:10:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:39.909 10:10:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:39.909 [2024-06-10 10:10:01.700926] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:39.909 [2024-06-10 10:10:01.700958] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:39.909 [2024-06-10 10:10:01.703548] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:39.909 [2024-06-10 10:10:01.703574] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:39.909 [2024-06-10 10:10:01.703600] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:39.909 [2024-06-10 10:10:01.703606] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16a4160 name raid_bdev1, state offline 00:14:39.909 0 00:14:39.909 10:10:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1005343 00:14:39.909 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1005343 ']' 00:14:39.909 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1005343 00:14:39.909 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:14:39.909 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:39.909 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1005343 00:14:40.170 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1005343' 00:14:40.171 killing process with pid 1005343 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1005343 00:14:40.171 [2024-06-10 10:10:01.784077] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1005343 00:14:40.171 [2024-06-10 10:10:01.795303] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.VRgE5Ct67L 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@213 -- # case $1 in 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:14:40.171 00:14:40.171 real 0m5.826s 00:14:40.171 user 0m9.283s 00:14:40.171 sys 0m0.799s 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:40.171 10:10:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.171 ************************************ 00:14:40.171 END TEST raid_write_error_test 00:14:40.171 ************************************ 00:14:40.171 10:10:01 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:40.171 10:10:01 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:14:40.171 10:10:01 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:14:40.171 10:10:01 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:40.171 10:10:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:40.171 ************************************ 00:14:40.171 START TEST raid_state_function_test 00:14:40.171 ************************************ 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 3 false 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1006445 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1006445' 00:14:40.171 Process raid pid: 1006445 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1006445 /var/tmp/spdk-raid.sock 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1006445 ']' 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:40.171 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:40.171 10:10:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.430 [2024-06-10 10:10:02.075711] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
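The raid_write_error_test that ends a few entries above drives everything through the rpc.py calls recorded in its trace: each base device is a malloc bdev wrapped first by an error-injection bdev and then by a passthru bdev, the three passthru devices are combined into a concat array with a superblock, a write failure is injected into EE_BaseBdev1_malloc while bdevperf runs perform_tests, and the resulting fail-per-second figure (0.50 here) is required to be non-zero. A condensed sketch of that sequence, using only commands and arguments visible in this log (the RPC shorthand variable is introduced here for brevity, not taken from the test script):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # three stacked base devices: malloc -> error injection -> passthru
  for i in 1 2 3; do
      $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
      $RPC bdev_error_create BaseBdev${i}_malloc
      $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
  done

  # concat array, 64 KiB strip size, with an on-disk superblock
  $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s

  # fail writes on the first base device while bdevperf exercises the array
  $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure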
00:14:40.430 [2024-06-10 10:10:02.075756] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:40.430 [2024-06-10 10:10:02.164483] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:40.430 [2024-06-10 10:10:02.228566] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:14:40.430 [2024-06-10 10:10:02.266861] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:40.430 [2024-06-10 10:10:02.266882] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:41.369 10:10:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:41.369 10:10:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:14:41.369 10:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:41.369 [2024-06-10 10:10:03.077820] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:41.369 [2024-06-10 10:10:03.077854] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:41.369 [2024-06-10 10:10:03.077860] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:41.369 [2024-06-10 10:10:03.077866] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:41.369 [2024-06-10 10:10:03.077871] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:41.369 [2024-06-10 10:10:03.077876] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.369 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:41.629 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:14:41.629 "name": "Existed_Raid", 00:14:41.629 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.629 "strip_size_kb": 0, 00:14:41.629 "state": "configuring", 00:14:41.629 "raid_level": "raid1", 00:14:41.629 "superblock": false, 00:14:41.629 "num_base_bdevs": 3, 00:14:41.629 "num_base_bdevs_discovered": 0, 00:14:41.629 "num_base_bdevs_operational": 3, 00:14:41.629 "base_bdevs_list": [ 00:14:41.629 { 00:14:41.629 "name": "BaseBdev1", 00:14:41.629 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.629 "is_configured": false, 00:14:41.629 "data_offset": 0, 00:14:41.629 "data_size": 0 00:14:41.629 }, 00:14:41.629 { 00:14:41.629 "name": "BaseBdev2", 00:14:41.629 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.629 "is_configured": false, 00:14:41.629 "data_offset": 0, 00:14:41.629 "data_size": 0 00:14:41.629 }, 00:14:41.629 { 00:14:41.629 "name": "BaseBdev3", 00:14:41.629 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.629 "is_configured": false, 00:14:41.629 "data_offset": 0, 00:14:41.629 "data_size": 0 00:14:41.629 } 00:14:41.629 ] 00:14:41.629 }' 00:14:41.629 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:41.629 10:10:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:42.200 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:42.200 [2024-06-10 10:10:03.976000] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:42.200 [2024-06-10 10:10:03.976015] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x160bb00 name Existed_Raid, state configuring 00:14:42.200 10:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:42.460 [2024-06-10 10:10:04.168497] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:42.460 [2024-06-10 10:10:04.168516] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:42.460 [2024-06-10 10:10:04.168520] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:42.460 [2024-06-10 10:10:04.168526] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:42.460 [2024-06-10 10:10:04.168531] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:42.460 [2024-06-10 10:10:04.168536] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:42.460 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:42.722 [2024-06-10 10:10:04.355507] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:42.722 BaseBdev1 00:14:42.722 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:42.722 10:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:14:42.722 10:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:42.722 10:10:04 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:14:42.722 10:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:42.722 10:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:42.722 10:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:42.722 10:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:42.983 [ 00:14:42.983 { 00:14:42.983 "name": "BaseBdev1", 00:14:42.983 "aliases": [ 00:14:42.983 "50953d9b-5f29-470e-bddb-555e14cf4f0e" 00:14:42.983 ], 00:14:42.983 "product_name": "Malloc disk", 00:14:42.983 "block_size": 512, 00:14:42.983 "num_blocks": 65536, 00:14:42.983 "uuid": "50953d9b-5f29-470e-bddb-555e14cf4f0e", 00:14:42.983 "assigned_rate_limits": { 00:14:42.983 "rw_ios_per_sec": 0, 00:14:42.983 "rw_mbytes_per_sec": 0, 00:14:42.983 "r_mbytes_per_sec": 0, 00:14:42.983 "w_mbytes_per_sec": 0 00:14:42.983 }, 00:14:42.983 "claimed": true, 00:14:42.984 "claim_type": "exclusive_write", 00:14:42.984 "zoned": false, 00:14:42.984 "supported_io_types": { 00:14:42.984 "read": true, 00:14:42.984 "write": true, 00:14:42.984 "unmap": true, 00:14:42.984 "write_zeroes": true, 00:14:42.984 "flush": true, 00:14:42.984 "reset": true, 00:14:42.984 "compare": false, 00:14:42.984 "compare_and_write": false, 00:14:42.984 "abort": true, 00:14:42.984 "nvme_admin": false, 00:14:42.984 "nvme_io": false 00:14:42.984 }, 00:14:42.984 "memory_domains": [ 00:14:42.984 { 00:14:42.984 "dma_device_id": "system", 00:14:42.984 "dma_device_type": 1 00:14:42.984 }, 00:14:42.984 { 00:14:42.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.984 "dma_device_type": 2 00:14:42.984 } 00:14:42.984 ], 00:14:42.984 "driver_specific": {} 00:14:42.984 } 00:14:42.984 ] 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
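Each verify_raid_bdev_state call in this trace reduces to one RPC plus a jq filter: dump all raid bdevs, select the entry named Existed_Raid, then compare fields such as state, raid_level, strip_size_kb and the base-bdev counts against the expected values. A hand-run equivalent of that check, reusing the socket and script paths shown above (the variable names are shorthand added here), might look like:

  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  SOCK=/var/tmp/spdk-raid.sock

  # fetch every raid bdev and keep only the one under test
  info=$($RPC -s $SOCK bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')

  # spot-check the fields the test asserts on
  echo "$info" | jq -r '.state'                      # "configuring" until all three base bdevs exist, then "online"
  echo "$info" | jq -r '.raid_level'                 # "raid1"
  echo "$info" | jq -r '.num_base_bdevs_discovered'  # climbs 0 -> 1 -> 2 -> 3 as BaseBdev1..3 are created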
00:14:42.984 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:43.244 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.244 "name": "Existed_Raid", 00:14:43.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.244 "strip_size_kb": 0, 00:14:43.244 "state": "configuring", 00:14:43.244 "raid_level": "raid1", 00:14:43.244 "superblock": false, 00:14:43.244 "num_base_bdevs": 3, 00:14:43.244 "num_base_bdevs_discovered": 1, 00:14:43.244 "num_base_bdevs_operational": 3, 00:14:43.244 "base_bdevs_list": [ 00:14:43.244 { 00:14:43.244 "name": "BaseBdev1", 00:14:43.244 "uuid": "50953d9b-5f29-470e-bddb-555e14cf4f0e", 00:14:43.244 "is_configured": true, 00:14:43.244 "data_offset": 0, 00:14:43.244 "data_size": 65536 00:14:43.244 }, 00:14:43.244 { 00:14:43.244 "name": "BaseBdev2", 00:14:43.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.244 "is_configured": false, 00:14:43.244 "data_offset": 0, 00:14:43.244 "data_size": 0 00:14:43.244 }, 00:14:43.244 { 00:14:43.244 "name": "BaseBdev3", 00:14:43.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.244 "is_configured": false, 00:14:43.244 "data_offset": 0, 00:14:43.244 "data_size": 0 00:14:43.244 } 00:14:43.244 ] 00:14:43.244 }' 00:14:43.244 10:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.244 10:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.816 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:43.816 [2024-06-10 10:10:05.562547] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:43.816 [2024-06-10 10:10:05.562571] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x160b3f0 name Existed_Raid, state configuring 00:14:43.816 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:44.077 [2024-06-10 10:10:05.739014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:44.077 [2024-06-10 10:10:05.740165] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:44.077 [2024-06-10 10:10:05.740188] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:44.077 [2024-06-10 10:10:05.740194] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:44.077 [2024-06-10 10:10:05.740199] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:44.077 10:10:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:44.077 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.338 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:44.338 "name": "Existed_Raid", 00:14:44.338 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:44.338 "strip_size_kb": 0, 00:14:44.338 "state": "configuring", 00:14:44.338 "raid_level": "raid1", 00:14:44.338 "superblock": false, 00:14:44.338 "num_base_bdevs": 3, 00:14:44.338 "num_base_bdevs_discovered": 1, 00:14:44.338 "num_base_bdevs_operational": 3, 00:14:44.338 "base_bdevs_list": [ 00:14:44.338 { 00:14:44.338 "name": "BaseBdev1", 00:14:44.338 "uuid": "50953d9b-5f29-470e-bddb-555e14cf4f0e", 00:14:44.338 "is_configured": true, 00:14:44.338 "data_offset": 0, 00:14:44.338 "data_size": 65536 00:14:44.338 }, 00:14:44.338 { 00:14:44.338 "name": "BaseBdev2", 00:14:44.338 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:44.338 "is_configured": false, 00:14:44.338 "data_offset": 0, 00:14:44.338 "data_size": 0 00:14:44.338 }, 00:14:44.338 { 00:14:44.338 "name": "BaseBdev3", 00:14:44.338 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:44.338 "is_configured": false, 00:14:44.338 "data_offset": 0, 00:14:44.338 "data_size": 0 00:14:44.338 } 00:14:44.338 ] 00:14:44.338 }' 00:14:44.338 10:10:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:44.338 10:10:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:44.599 10:10:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:44.860 [2024-06-10 10:10:06.642225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:44.860 BaseBdev2 00:14:44.860 10:10:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:44.860 10:10:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:14:44.860 10:10:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:44.860 10:10:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:14:44.860 10:10:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:44.860 10:10:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:44.860 10:10:06 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:45.121 10:10:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:45.382 [ 00:14:45.382 { 00:14:45.382 "name": "BaseBdev2", 00:14:45.382 "aliases": [ 00:14:45.382 "be9eb103-04d6-4949-a6ae-6b3347076552" 00:14:45.382 ], 00:14:45.382 "product_name": "Malloc disk", 00:14:45.382 "block_size": 512, 00:14:45.382 "num_blocks": 65536, 00:14:45.382 "uuid": "be9eb103-04d6-4949-a6ae-6b3347076552", 00:14:45.382 "assigned_rate_limits": { 00:14:45.382 "rw_ios_per_sec": 0, 00:14:45.382 "rw_mbytes_per_sec": 0, 00:14:45.382 "r_mbytes_per_sec": 0, 00:14:45.382 "w_mbytes_per_sec": 0 00:14:45.382 }, 00:14:45.382 "claimed": true, 00:14:45.383 "claim_type": "exclusive_write", 00:14:45.383 "zoned": false, 00:14:45.383 "supported_io_types": { 00:14:45.383 "read": true, 00:14:45.383 "write": true, 00:14:45.383 "unmap": true, 00:14:45.383 "write_zeroes": true, 00:14:45.383 "flush": true, 00:14:45.383 "reset": true, 00:14:45.383 "compare": false, 00:14:45.383 "compare_and_write": false, 00:14:45.383 "abort": true, 00:14:45.383 "nvme_admin": false, 00:14:45.383 "nvme_io": false 00:14:45.383 }, 00:14:45.383 "memory_domains": [ 00:14:45.383 { 00:14:45.383 "dma_device_id": "system", 00:14:45.383 "dma_device_type": 1 00:14:45.383 }, 00:14:45.383 { 00:14:45.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.383 "dma_device_type": 2 00:14:45.383 } 00:14:45.383 ], 00:14:45.383 "driver_specific": {} 00:14:45.383 } 00:14:45.383 ] 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:45.383 "name": "Existed_Raid", 00:14:45.383 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.383 "strip_size_kb": 0, 00:14:45.383 "state": "configuring", 00:14:45.383 "raid_level": "raid1", 00:14:45.383 "superblock": false, 00:14:45.383 "num_base_bdevs": 3, 00:14:45.383 "num_base_bdevs_discovered": 2, 00:14:45.383 "num_base_bdevs_operational": 3, 00:14:45.383 "base_bdevs_list": [ 00:14:45.383 { 00:14:45.383 "name": "BaseBdev1", 00:14:45.383 "uuid": "50953d9b-5f29-470e-bddb-555e14cf4f0e", 00:14:45.383 "is_configured": true, 00:14:45.383 "data_offset": 0, 00:14:45.383 "data_size": 65536 00:14:45.383 }, 00:14:45.383 { 00:14:45.383 "name": "BaseBdev2", 00:14:45.383 "uuid": "be9eb103-04d6-4949-a6ae-6b3347076552", 00:14:45.383 "is_configured": true, 00:14:45.383 "data_offset": 0, 00:14:45.383 "data_size": 65536 00:14:45.383 }, 00:14:45.383 { 00:14:45.383 "name": "BaseBdev3", 00:14:45.383 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.383 "is_configured": false, 00:14:45.383 "data_offset": 0, 00:14:45.383 "data_size": 0 00:14:45.383 } 00:14:45.383 ] 00:14:45.383 }' 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:45.383 10:10:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:45.955 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:46.217 [2024-06-10 10:10:07.918547] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:46.217 [2024-06-10 10:10:07.918570] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x160c2c0 00:14:46.217 [2024-06-10 10:10:07.918574] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:14:46.217 [2024-06-10 10:10:07.918715] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b23a0 00:14:46.217 [2024-06-10 10:10:07.918811] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x160c2c0 00:14:46.217 [2024-06-10 10:10:07.918820] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x160c2c0 00:14:46.217 [2024-06-10 10:10:07.918946] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:46.217 BaseBdev3 00:14:46.217 10:10:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:46.217 10:10:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:14:46.217 10:10:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:46.217 10:10:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:14:46.217 10:10:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:46.217 10:10:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:46.217 10:10:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:46.478 10:10:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:46.478 [ 00:14:46.478 { 00:14:46.478 "name": "BaseBdev3", 00:14:46.478 "aliases": [ 00:14:46.478 "379053b8-55af-43c4-b62d-6d2dffb349aa" 00:14:46.478 ], 00:14:46.478 "product_name": "Malloc disk", 00:14:46.479 "block_size": 512, 00:14:46.479 "num_blocks": 65536, 00:14:46.479 "uuid": "379053b8-55af-43c4-b62d-6d2dffb349aa", 00:14:46.479 "assigned_rate_limits": { 00:14:46.479 "rw_ios_per_sec": 0, 00:14:46.479 "rw_mbytes_per_sec": 0, 00:14:46.479 "r_mbytes_per_sec": 0, 00:14:46.479 "w_mbytes_per_sec": 0 00:14:46.479 }, 00:14:46.479 "claimed": true, 00:14:46.479 "claim_type": "exclusive_write", 00:14:46.479 "zoned": false, 00:14:46.479 "supported_io_types": { 00:14:46.479 "read": true, 00:14:46.479 "write": true, 00:14:46.479 "unmap": true, 00:14:46.479 "write_zeroes": true, 00:14:46.479 "flush": true, 00:14:46.479 "reset": true, 00:14:46.479 "compare": false, 00:14:46.479 "compare_and_write": false, 00:14:46.479 "abort": true, 00:14:46.479 "nvme_admin": false, 00:14:46.479 "nvme_io": false 00:14:46.479 }, 00:14:46.479 "memory_domains": [ 00:14:46.479 { 00:14:46.479 "dma_device_id": "system", 00:14:46.479 "dma_device_type": 1 00:14:46.479 }, 00:14:46.479 { 00:14:46.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.479 "dma_device_type": 2 00:14:46.479 } 00:14:46.479 ], 00:14:46.479 "driver_specific": {} 00:14:46.479 } 00:14:46.479 ] 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.479 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.740 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.740 "name": "Existed_Raid", 00:14:46.740 "uuid": "c5a21270-9edc-4492-867b-a22474b7acdb", 00:14:46.740 "strip_size_kb": 0, 00:14:46.740 "state": "online", 
00:14:46.740 "raid_level": "raid1", 00:14:46.740 "superblock": false, 00:14:46.740 "num_base_bdevs": 3, 00:14:46.740 "num_base_bdevs_discovered": 3, 00:14:46.740 "num_base_bdevs_operational": 3, 00:14:46.740 "base_bdevs_list": [ 00:14:46.740 { 00:14:46.740 "name": "BaseBdev1", 00:14:46.740 "uuid": "50953d9b-5f29-470e-bddb-555e14cf4f0e", 00:14:46.740 "is_configured": true, 00:14:46.740 "data_offset": 0, 00:14:46.740 "data_size": 65536 00:14:46.740 }, 00:14:46.740 { 00:14:46.740 "name": "BaseBdev2", 00:14:46.740 "uuid": "be9eb103-04d6-4949-a6ae-6b3347076552", 00:14:46.740 "is_configured": true, 00:14:46.740 "data_offset": 0, 00:14:46.740 "data_size": 65536 00:14:46.740 }, 00:14:46.740 { 00:14:46.740 "name": "BaseBdev3", 00:14:46.740 "uuid": "379053b8-55af-43c4-b62d-6d2dffb349aa", 00:14:46.740 "is_configured": true, 00:14:46.740 "data_offset": 0, 00:14:46.740 "data_size": 65536 00:14:46.740 } 00:14:46.740 ] 00:14:46.740 }' 00:14:46.740 10:10:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.740 10:10:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.310 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:47.310 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:47.310 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:47.310 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:47.310 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:47.310 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:47.310 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:47.310 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:47.614 [2024-06-10 10:10:09.234083] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:47.614 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:47.614 "name": "Existed_Raid", 00:14:47.614 "aliases": [ 00:14:47.614 "c5a21270-9edc-4492-867b-a22474b7acdb" 00:14:47.614 ], 00:14:47.614 "product_name": "Raid Volume", 00:14:47.614 "block_size": 512, 00:14:47.614 "num_blocks": 65536, 00:14:47.614 "uuid": "c5a21270-9edc-4492-867b-a22474b7acdb", 00:14:47.614 "assigned_rate_limits": { 00:14:47.614 "rw_ios_per_sec": 0, 00:14:47.614 "rw_mbytes_per_sec": 0, 00:14:47.614 "r_mbytes_per_sec": 0, 00:14:47.614 "w_mbytes_per_sec": 0 00:14:47.614 }, 00:14:47.614 "claimed": false, 00:14:47.614 "zoned": false, 00:14:47.614 "supported_io_types": { 00:14:47.614 "read": true, 00:14:47.614 "write": true, 00:14:47.614 "unmap": false, 00:14:47.614 "write_zeroes": true, 00:14:47.614 "flush": false, 00:14:47.614 "reset": true, 00:14:47.614 "compare": false, 00:14:47.614 "compare_and_write": false, 00:14:47.614 "abort": false, 00:14:47.614 "nvme_admin": false, 00:14:47.614 "nvme_io": false 00:14:47.614 }, 00:14:47.614 "memory_domains": [ 00:14:47.614 { 00:14:47.614 "dma_device_id": "system", 00:14:47.614 "dma_device_type": 1 00:14:47.614 }, 00:14:47.614 { 00:14:47.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.614 "dma_device_type": 2 00:14:47.614 }, 
00:14:47.614 { 00:14:47.614 "dma_device_id": "system", 00:14:47.614 "dma_device_type": 1 00:14:47.614 }, 00:14:47.614 { 00:14:47.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.614 "dma_device_type": 2 00:14:47.614 }, 00:14:47.614 { 00:14:47.614 "dma_device_id": "system", 00:14:47.614 "dma_device_type": 1 00:14:47.614 }, 00:14:47.614 { 00:14:47.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.614 "dma_device_type": 2 00:14:47.614 } 00:14:47.614 ], 00:14:47.614 "driver_specific": { 00:14:47.614 "raid": { 00:14:47.614 "uuid": "c5a21270-9edc-4492-867b-a22474b7acdb", 00:14:47.614 "strip_size_kb": 0, 00:14:47.614 "state": "online", 00:14:47.614 "raid_level": "raid1", 00:14:47.614 "superblock": false, 00:14:47.614 "num_base_bdevs": 3, 00:14:47.614 "num_base_bdevs_discovered": 3, 00:14:47.614 "num_base_bdevs_operational": 3, 00:14:47.614 "base_bdevs_list": [ 00:14:47.614 { 00:14:47.614 "name": "BaseBdev1", 00:14:47.614 "uuid": "50953d9b-5f29-470e-bddb-555e14cf4f0e", 00:14:47.614 "is_configured": true, 00:14:47.614 "data_offset": 0, 00:14:47.614 "data_size": 65536 00:14:47.614 }, 00:14:47.614 { 00:14:47.614 "name": "BaseBdev2", 00:14:47.614 "uuid": "be9eb103-04d6-4949-a6ae-6b3347076552", 00:14:47.614 "is_configured": true, 00:14:47.615 "data_offset": 0, 00:14:47.615 "data_size": 65536 00:14:47.615 }, 00:14:47.615 { 00:14:47.615 "name": "BaseBdev3", 00:14:47.615 "uuid": "379053b8-55af-43c4-b62d-6d2dffb349aa", 00:14:47.615 "is_configured": true, 00:14:47.615 "data_offset": 0, 00:14:47.615 "data_size": 65536 00:14:47.615 } 00:14:47.615 ] 00:14:47.615 } 00:14:47.615 } 00:14:47.615 }' 00:14:47.615 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:47.615 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:47.615 BaseBdev2 00:14:47.615 BaseBdev3' 00:14:47.615 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:47.615 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:47.615 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:47.876 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:47.876 "name": "BaseBdev1", 00:14:47.876 "aliases": [ 00:14:47.876 "50953d9b-5f29-470e-bddb-555e14cf4f0e" 00:14:47.876 ], 00:14:47.876 "product_name": "Malloc disk", 00:14:47.876 "block_size": 512, 00:14:47.876 "num_blocks": 65536, 00:14:47.876 "uuid": "50953d9b-5f29-470e-bddb-555e14cf4f0e", 00:14:47.876 "assigned_rate_limits": { 00:14:47.876 "rw_ios_per_sec": 0, 00:14:47.876 "rw_mbytes_per_sec": 0, 00:14:47.876 "r_mbytes_per_sec": 0, 00:14:47.876 "w_mbytes_per_sec": 0 00:14:47.876 }, 00:14:47.876 "claimed": true, 00:14:47.876 "claim_type": "exclusive_write", 00:14:47.876 "zoned": false, 00:14:47.876 "supported_io_types": { 00:14:47.876 "read": true, 00:14:47.876 "write": true, 00:14:47.876 "unmap": true, 00:14:47.876 "write_zeroes": true, 00:14:47.876 "flush": true, 00:14:47.876 "reset": true, 00:14:47.876 "compare": false, 00:14:47.876 "compare_and_write": false, 00:14:47.876 "abort": true, 00:14:47.876 "nvme_admin": false, 00:14:47.876 "nvme_io": false 00:14:47.876 }, 00:14:47.876 "memory_domains": [ 00:14:47.876 { 00:14:47.876 "dma_device_id": "system", 
00:14:47.876 "dma_device_type": 1 00:14:47.876 }, 00:14:47.876 { 00:14:47.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.876 "dma_device_type": 2 00:14:47.876 } 00:14:47.876 ], 00:14:47.876 "driver_specific": {} 00:14:47.876 }' 00:14:47.876 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:47.876 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:47.876 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:47.876 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:47.876 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:47.876 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:47.876 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:47.876 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.137 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:48.137 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.137 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.137 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:48.137 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:48.137 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:48.137 10:10:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:48.399 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:48.399 "name": "BaseBdev2", 00:14:48.399 "aliases": [ 00:14:48.399 "be9eb103-04d6-4949-a6ae-6b3347076552" 00:14:48.399 ], 00:14:48.399 "product_name": "Malloc disk", 00:14:48.399 "block_size": 512, 00:14:48.399 "num_blocks": 65536, 00:14:48.399 "uuid": "be9eb103-04d6-4949-a6ae-6b3347076552", 00:14:48.399 "assigned_rate_limits": { 00:14:48.399 "rw_ios_per_sec": 0, 00:14:48.399 "rw_mbytes_per_sec": 0, 00:14:48.399 "r_mbytes_per_sec": 0, 00:14:48.399 "w_mbytes_per_sec": 0 00:14:48.399 }, 00:14:48.399 "claimed": true, 00:14:48.399 "claim_type": "exclusive_write", 00:14:48.399 "zoned": false, 00:14:48.399 "supported_io_types": { 00:14:48.399 "read": true, 00:14:48.399 "write": true, 00:14:48.399 "unmap": true, 00:14:48.399 "write_zeroes": true, 00:14:48.399 "flush": true, 00:14:48.399 "reset": true, 00:14:48.399 "compare": false, 00:14:48.399 "compare_and_write": false, 00:14:48.399 "abort": true, 00:14:48.399 "nvme_admin": false, 00:14:48.399 "nvme_io": false 00:14:48.399 }, 00:14:48.399 "memory_domains": [ 00:14:48.399 { 00:14:48.399 "dma_device_id": "system", 00:14:48.399 "dma_device_type": 1 00:14:48.399 }, 00:14:48.399 { 00:14:48.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.399 "dma_device_type": 2 00:14:48.399 } 00:14:48.399 ], 00:14:48.399 "driver_specific": {} 00:14:48.399 }' 00:14:48.399 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.399 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.399 10:10:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:48.399 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.399 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.399 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:48.399 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.660 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.660 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:48.660 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.660 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.660 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:48.660 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:48.660 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:48.660 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:48.921 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:48.921 "name": "BaseBdev3", 00:14:48.921 "aliases": [ 00:14:48.921 "379053b8-55af-43c4-b62d-6d2dffb349aa" 00:14:48.921 ], 00:14:48.921 "product_name": "Malloc disk", 00:14:48.921 "block_size": 512, 00:14:48.921 "num_blocks": 65536, 00:14:48.921 "uuid": "379053b8-55af-43c4-b62d-6d2dffb349aa", 00:14:48.921 "assigned_rate_limits": { 00:14:48.921 "rw_ios_per_sec": 0, 00:14:48.921 "rw_mbytes_per_sec": 0, 00:14:48.921 "r_mbytes_per_sec": 0, 00:14:48.921 "w_mbytes_per_sec": 0 00:14:48.921 }, 00:14:48.921 "claimed": true, 00:14:48.921 "claim_type": "exclusive_write", 00:14:48.921 "zoned": false, 00:14:48.921 "supported_io_types": { 00:14:48.921 "read": true, 00:14:48.921 "write": true, 00:14:48.921 "unmap": true, 00:14:48.921 "write_zeroes": true, 00:14:48.921 "flush": true, 00:14:48.921 "reset": true, 00:14:48.921 "compare": false, 00:14:48.921 "compare_and_write": false, 00:14:48.921 "abort": true, 00:14:48.921 "nvme_admin": false, 00:14:48.921 "nvme_io": false 00:14:48.921 }, 00:14:48.921 "memory_domains": [ 00:14:48.921 { 00:14:48.921 "dma_device_id": "system", 00:14:48.921 "dma_device_type": 1 00:14:48.921 }, 00:14:48.921 { 00:14:48.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.921 "dma_device_type": 2 00:14:48.921 } 00:14:48.921 ], 00:14:48.921 "driver_specific": {} 00:14:48.921 }' 00:14:48.921 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.921 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.921 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:48.921 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.921 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.921 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:48.921 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:14:49.187 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:49.187 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:49.187 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:49.187 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:49.187 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:49.187 10:10:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:49.449 [2024-06-10 10:10:11.098622] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.449 "name": "Existed_Raid", 00:14:49.449 "uuid": "c5a21270-9edc-4492-867b-a22474b7acdb", 00:14:49.449 "strip_size_kb": 0, 00:14:49.449 "state": "online", 00:14:49.449 "raid_level": "raid1", 00:14:49.449 "superblock": false, 00:14:49.449 "num_base_bdevs": 3, 00:14:49.449 "num_base_bdevs_discovered": 2, 00:14:49.449 "num_base_bdevs_operational": 2, 00:14:49.449 "base_bdevs_list": [ 00:14:49.449 { 00:14:49.449 "name": null, 00:14:49.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.449 "is_configured": false, 00:14:49.449 "data_offset": 0, 00:14:49.449 "data_size": 65536 
00:14:49.449 }, 00:14:49.449 { 00:14:49.449 "name": "BaseBdev2", 00:14:49.449 "uuid": "be9eb103-04d6-4949-a6ae-6b3347076552", 00:14:49.449 "is_configured": true, 00:14:49.449 "data_offset": 0, 00:14:49.449 "data_size": 65536 00:14:49.449 }, 00:14:49.449 { 00:14:49.449 "name": "BaseBdev3", 00:14:49.449 "uuid": "379053b8-55af-43c4-b62d-6d2dffb349aa", 00:14:49.449 "is_configured": true, 00:14:49.449 "data_offset": 0, 00:14:49.449 "data_size": 65536 00:14:49.449 } 00:14:49.449 ] 00:14:49.449 }' 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.449 10:10:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.021 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:50.021 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:50.021 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.021 10:10:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:50.282 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:50.282 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:50.282 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:50.542 [2024-06-10 10:10:12.233495] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:50.542 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:50.542 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:50.542 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.542 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:50.802 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:50.802 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:50.802 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:50.802 [2024-06-10 10:10:12.616324] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:50.802 [2024-06-10 10:10:12.616383] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:50.802 [2024-06-10 10:10:12.622312] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:50.802 [2024-06-10 10:10:12.622336] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:50.802 [2024-06-10 10:10:12.622342] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x160c2c0 name Existed_Raid, state offline 00:14:50.802 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:50.802 10:10:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:50.802 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.802 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:51.063 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:51.063 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:51.063 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:51.063 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:51.063 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:51.063 10:10:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:51.323 BaseBdev2 00:14:51.323 10:10:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:51.323 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:14:51.323 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:51.323 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:14:51.323 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:51.323 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:51.323 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:51.584 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:51.584 [ 00:14:51.584 { 00:14:51.584 "name": "BaseBdev2", 00:14:51.584 "aliases": [ 00:14:51.584 "62ef8b89-37f6-4917-8f66-682995a7adcc" 00:14:51.584 ], 00:14:51.584 "product_name": "Malloc disk", 00:14:51.584 "block_size": 512, 00:14:51.584 "num_blocks": 65536, 00:14:51.584 "uuid": "62ef8b89-37f6-4917-8f66-682995a7adcc", 00:14:51.584 "assigned_rate_limits": { 00:14:51.584 "rw_ios_per_sec": 0, 00:14:51.584 "rw_mbytes_per_sec": 0, 00:14:51.584 "r_mbytes_per_sec": 0, 00:14:51.584 "w_mbytes_per_sec": 0 00:14:51.584 }, 00:14:51.584 "claimed": false, 00:14:51.584 "zoned": false, 00:14:51.584 "supported_io_types": { 00:14:51.584 "read": true, 00:14:51.584 "write": true, 00:14:51.584 "unmap": true, 00:14:51.584 "write_zeroes": true, 00:14:51.584 "flush": true, 00:14:51.584 "reset": true, 00:14:51.584 "compare": false, 00:14:51.584 "compare_and_write": false, 00:14:51.584 "abort": true, 00:14:51.584 "nvme_admin": false, 00:14:51.584 "nvme_io": false 00:14:51.584 }, 00:14:51.584 "memory_domains": [ 00:14:51.584 { 00:14:51.584 "dma_device_id": "system", 00:14:51.584 "dma_device_type": 1 00:14:51.584 }, 00:14:51.584 { 00:14:51.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.584 "dma_device_type": 2 00:14:51.584 } 00:14:51.584 ], 00:14:51.584 "driver_specific": {} 00:14:51.584 } 00:14:51.584 ] 00:14:51.584 
10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:14:51.584 10:10:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:51.584 10:10:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:51.584 10:10:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:51.845 BaseBdev3 00:14:51.845 10:10:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:51.845 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:14:51.845 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:51.845 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:14:51.845 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:51.845 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:51.845 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:52.106 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:52.106 [ 00:14:52.106 { 00:14:52.106 "name": "BaseBdev3", 00:14:52.106 "aliases": [ 00:14:52.106 "d265d554-213d-4f08-b9f8-a3439c3d4f08" 00:14:52.106 ], 00:14:52.106 "product_name": "Malloc disk", 00:14:52.106 "block_size": 512, 00:14:52.106 "num_blocks": 65536, 00:14:52.106 "uuid": "d265d554-213d-4f08-b9f8-a3439c3d4f08", 00:14:52.106 "assigned_rate_limits": { 00:14:52.106 "rw_ios_per_sec": 0, 00:14:52.106 "rw_mbytes_per_sec": 0, 00:14:52.106 "r_mbytes_per_sec": 0, 00:14:52.106 "w_mbytes_per_sec": 0 00:14:52.106 }, 00:14:52.106 "claimed": false, 00:14:52.106 "zoned": false, 00:14:52.106 "supported_io_types": { 00:14:52.106 "read": true, 00:14:52.106 "write": true, 00:14:52.106 "unmap": true, 00:14:52.106 "write_zeroes": true, 00:14:52.106 "flush": true, 00:14:52.106 "reset": true, 00:14:52.106 "compare": false, 00:14:52.106 "compare_and_write": false, 00:14:52.106 "abort": true, 00:14:52.106 "nvme_admin": false, 00:14:52.106 "nvme_io": false 00:14:52.106 }, 00:14:52.106 "memory_domains": [ 00:14:52.106 { 00:14:52.106 "dma_device_id": "system", 00:14:52.106 "dma_device_type": 1 00:14:52.106 }, 00:14:52.106 { 00:14:52.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.106 "dma_device_type": 2 00:14:52.106 } 00:14:52.106 ], 00:14:52.106 "driver_specific": {} 00:14:52.106 } 00:14:52.106 ] 00:14:52.106 10:10:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:14:52.106 10:10:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:52.106 10:10:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:52.106 10:10:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:52.367 [2024-06-10 
10:10:14.099928] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:52.367 [2024-06-10 10:10:14.099955] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:52.367 [2024-06-10 10:10:14.099967] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:52.367 [2024-06-10 10:10:14.100986] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.367 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:52.628 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.628 "name": "Existed_Raid", 00:14:52.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.628 "strip_size_kb": 0, 00:14:52.628 "state": "configuring", 00:14:52.628 "raid_level": "raid1", 00:14:52.628 "superblock": false, 00:14:52.628 "num_base_bdevs": 3, 00:14:52.628 "num_base_bdevs_discovered": 2, 00:14:52.628 "num_base_bdevs_operational": 3, 00:14:52.628 "base_bdevs_list": [ 00:14:52.628 { 00:14:52.628 "name": "BaseBdev1", 00:14:52.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.628 "is_configured": false, 00:14:52.628 "data_offset": 0, 00:14:52.628 "data_size": 0 00:14:52.628 }, 00:14:52.628 { 00:14:52.628 "name": "BaseBdev2", 00:14:52.628 "uuid": "62ef8b89-37f6-4917-8f66-682995a7adcc", 00:14:52.628 "is_configured": true, 00:14:52.628 "data_offset": 0, 00:14:52.628 "data_size": 65536 00:14:52.628 }, 00:14:52.628 { 00:14:52.628 "name": "BaseBdev3", 00:14:52.628 "uuid": "d265d554-213d-4f08-b9f8-a3439c3d4f08", 00:14:52.628 "is_configured": true, 00:14:52.628 "data_offset": 0, 00:14:52.628 "data_size": 65536 00:14:52.628 } 00:14:52.628 ] 00:14:52.628 }' 00:14:52.628 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.628 10:10:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:53.200 10:10:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:53.200 [2024-06-10 10:10:15.010206] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.200 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:53.461 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.461 "name": "Existed_Raid", 00:14:53.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.461 "strip_size_kb": 0, 00:14:53.461 "state": "configuring", 00:14:53.461 "raid_level": "raid1", 00:14:53.461 "superblock": false, 00:14:53.461 "num_base_bdevs": 3, 00:14:53.461 "num_base_bdevs_discovered": 1, 00:14:53.461 "num_base_bdevs_operational": 3, 00:14:53.461 "base_bdevs_list": [ 00:14:53.461 { 00:14:53.461 "name": "BaseBdev1", 00:14:53.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.461 "is_configured": false, 00:14:53.461 "data_offset": 0, 00:14:53.461 "data_size": 0 00:14:53.461 }, 00:14:53.461 { 00:14:53.461 "name": null, 00:14:53.461 "uuid": "62ef8b89-37f6-4917-8f66-682995a7adcc", 00:14:53.461 "is_configured": false, 00:14:53.461 "data_offset": 0, 00:14:53.461 "data_size": 65536 00:14:53.461 }, 00:14:53.461 { 00:14:53.461 "name": "BaseBdev3", 00:14:53.461 "uuid": "d265d554-213d-4f08-b9f8-a3439c3d4f08", 00:14:53.462 "is_configured": true, 00:14:53.462 "data_offset": 0, 00:14:53.462 "data_size": 65536 00:14:53.462 } 00:14:53.462 ] 00:14:53.462 }' 00:14:53.462 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.462 10:10:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.034 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.034 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:54.294 10:10:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:54.294 10:10:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:54.294 [2024-06-10 10:10:16.113902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:54.294 BaseBdev1 00:14:54.294 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:54.294 10:10:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:14:54.294 10:10:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:54.294 10:10:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:14:54.294 10:10:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:54.294 10:10:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:54.294 10:10:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:54.555 10:10:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:54.816 [ 00:14:54.816 { 00:14:54.816 "name": "BaseBdev1", 00:14:54.816 "aliases": [ 00:14:54.816 "68375251-5623-4801-843a-848a61526846" 00:14:54.816 ], 00:14:54.816 "product_name": "Malloc disk", 00:14:54.816 "block_size": 512, 00:14:54.816 "num_blocks": 65536, 00:14:54.816 "uuid": "68375251-5623-4801-843a-848a61526846", 00:14:54.816 "assigned_rate_limits": { 00:14:54.816 "rw_ios_per_sec": 0, 00:14:54.816 "rw_mbytes_per_sec": 0, 00:14:54.816 "r_mbytes_per_sec": 0, 00:14:54.816 "w_mbytes_per_sec": 0 00:14:54.816 }, 00:14:54.816 "claimed": true, 00:14:54.816 "claim_type": "exclusive_write", 00:14:54.816 "zoned": false, 00:14:54.816 "supported_io_types": { 00:14:54.816 "read": true, 00:14:54.816 "write": true, 00:14:54.816 "unmap": true, 00:14:54.816 "write_zeroes": true, 00:14:54.816 "flush": true, 00:14:54.816 "reset": true, 00:14:54.816 "compare": false, 00:14:54.816 "compare_and_write": false, 00:14:54.816 "abort": true, 00:14:54.816 "nvme_admin": false, 00:14:54.816 "nvme_io": false 00:14:54.816 }, 00:14:54.816 "memory_domains": [ 00:14:54.816 { 00:14:54.816 "dma_device_id": "system", 00:14:54.816 "dma_device_type": 1 00:14:54.816 }, 00:14:54.816 { 00:14:54.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.816 "dma_device_type": 2 00:14:54.816 } 00:14:54.816 ], 00:14:54.816 "driver_specific": {} 00:14:54.816 } 00:14:54.816 ] 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:54.816 10:10:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.816 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:55.110 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.110 "name": "Existed_Raid", 00:14:55.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:55.110 "strip_size_kb": 0, 00:14:55.110 "state": "configuring", 00:14:55.110 "raid_level": "raid1", 00:14:55.110 "superblock": false, 00:14:55.110 "num_base_bdevs": 3, 00:14:55.110 "num_base_bdevs_discovered": 2, 00:14:55.110 "num_base_bdevs_operational": 3, 00:14:55.110 "base_bdevs_list": [ 00:14:55.110 { 00:14:55.110 "name": "BaseBdev1", 00:14:55.110 "uuid": "68375251-5623-4801-843a-848a61526846", 00:14:55.110 "is_configured": true, 00:14:55.110 "data_offset": 0, 00:14:55.110 "data_size": 65536 00:14:55.110 }, 00:14:55.110 { 00:14:55.110 "name": null, 00:14:55.110 "uuid": "62ef8b89-37f6-4917-8f66-682995a7adcc", 00:14:55.110 "is_configured": false, 00:14:55.110 "data_offset": 0, 00:14:55.110 "data_size": 65536 00:14:55.110 }, 00:14:55.110 { 00:14:55.110 "name": "BaseBdev3", 00:14:55.110 "uuid": "d265d554-213d-4f08-b9f8-a3439c3d4f08", 00:14:55.110 "is_configured": true, 00:14:55.110 "data_offset": 0, 00:14:55.110 "data_size": 65536 00:14:55.110 } 00:14:55.110 ] 00:14:55.110 }' 00:14:55.110 10:10:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.110 10:10:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.705 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.705 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:55.705 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:55.705 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:55.966 [2024-06-10 10:10:17.653812] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:55.966 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:55.966 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:55.966 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:55.966 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:55.966 10:10:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:55.966 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:55.966 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:55.966 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:55.966 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:55.966 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:55.966 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.966 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.227 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.227 "name": "Existed_Raid", 00:14:56.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.227 "strip_size_kb": 0, 00:14:56.227 "state": "configuring", 00:14:56.227 "raid_level": "raid1", 00:14:56.227 "superblock": false, 00:14:56.227 "num_base_bdevs": 3, 00:14:56.227 "num_base_bdevs_discovered": 1, 00:14:56.227 "num_base_bdevs_operational": 3, 00:14:56.227 "base_bdevs_list": [ 00:14:56.227 { 00:14:56.227 "name": "BaseBdev1", 00:14:56.227 "uuid": "68375251-5623-4801-843a-848a61526846", 00:14:56.227 "is_configured": true, 00:14:56.227 "data_offset": 0, 00:14:56.227 "data_size": 65536 00:14:56.227 }, 00:14:56.227 { 00:14:56.227 "name": null, 00:14:56.227 "uuid": "62ef8b89-37f6-4917-8f66-682995a7adcc", 00:14:56.227 "is_configured": false, 00:14:56.227 "data_offset": 0, 00:14:56.227 "data_size": 65536 00:14:56.227 }, 00:14:56.227 { 00:14:56.227 "name": null, 00:14:56.227 "uuid": "d265d554-213d-4f08-b9f8-a3439c3d4f08", 00:14:56.227 "is_configured": false, 00:14:56.227 "data_offset": 0, 00:14:56.227 "data_size": 65536 00:14:56.227 } 00:14:56.227 ] 00:14:56.227 }' 00:14:56.227 10:10:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.227 10:10:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.801 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.801 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:56.801 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:56.801 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:57.062 [2024-06-10 10:10:18.764622] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.062 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:57.322 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:57.322 "name": "Existed_Raid", 00:14:57.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:57.323 "strip_size_kb": 0, 00:14:57.323 "state": "configuring", 00:14:57.323 "raid_level": "raid1", 00:14:57.323 "superblock": false, 00:14:57.323 "num_base_bdevs": 3, 00:14:57.323 "num_base_bdevs_discovered": 2, 00:14:57.323 "num_base_bdevs_operational": 3, 00:14:57.323 "base_bdevs_list": [ 00:14:57.323 { 00:14:57.323 "name": "BaseBdev1", 00:14:57.323 "uuid": "68375251-5623-4801-843a-848a61526846", 00:14:57.323 "is_configured": true, 00:14:57.323 "data_offset": 0, 00:14:57.323 "data_size": 65536 00:14:57.323 }, 00:14:57.323 { 00:14:57.323 "name": null, 00:14:57.323 "uuid": "62ef8b89-37f6-4917-8f66-682995a7adcc", 00:14:57.323 "is_configured": false, 00:14:57.323 "data_offset": 0, 00:14:57.323 "data_size": 65536 00:14:57.323 }, 00:14:57.323 { 00:14:57.323 "name": "BaseBdev3", 00:14:57.323 "uuid": "d265d554-213d-4f08-b9f8-a3439c3d4f08", 00:14:57.323 "is_configured": true, 00:14:57.323 "data_offset": 0, 00:14:57.323 "data_size": 65536 00:14:57.323 } 00:14:57.323 ] 00:14:57.323 }' 00:14:57.323 10:10:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:57.323 10:10:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.896 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.896 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:57.896 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:57.896 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:58.157 [2024-06-10 10:10:19.891492] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:58.157 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:58.157 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:58.157 10:10:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:58.157 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:58.157 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:58.157 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:58.157 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:58.157 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:58.157 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:58.157 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:58.157 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.157 10:10:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:58.419 10:10:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:58.419 "name": "Existed_Raid", 00:14:58.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.419 "strip_size_kb": 0, 00:14:58.419 "state": "configuring", 00:14:58.419 "raid_level": "raid1", 00:14:58.419 "superblock": false, 00:14:58.419 "num_base_bdevs": 3, 00:14:58.419 "num_base_bdevs_discovered": 1, 00:14:58.419 "num_base_bdevs_operational": 3, 00:14:58.419 "base_bdevs_list": [ 00:14:58.419 { 00:14:58.419 "name": null, 00:14:58.419 "uuid": "68375251-5623-4801-843a-848a61526846", 00:14:58.419 "is_configured": false, 00:14:58.419 "data_offset": 0, 00:14:58.419 "data_size": 65536 00:14:58.419 }, 00:14:58.419 { 00:14:58.419 "name": null, 00:14:58.419 "uuid": "62ef8b89-37f6-4917-8f66-682995a7adcc", 00:14:58.419 "is_configured": false, 00:14:58.419 "data_offset": 0, 00:14:58.419 "data_size": 65536 00:14:58.419 }, 00:14:58.419 { 00:14:58.419 "name": "BaseBdev3", 00:14:58.419 "uuid": "d265d554-213d-4f08-b9f8-a3439c3d4f08", 00:14:58.419 "is_configured": true, 00:14:58.419 "data_offset": 0, 00:14:58.419 "data_size": 65536 00:14:58.419 } 00:14:58.419 ] 00:14:58.419 }' 00:14:58.419 10:10:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:58.419 10:10:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.992 10:10:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.992 10:10:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:58.992 10:10:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:58.992 10:10:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:59.254 [2024-06-10 10:10:20.996063] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:59.254 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:59.254 
10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:59.254 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:59.254 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:59.254 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:59.254 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:59.254 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:59.254 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:59.254 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:59.254 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:59.254 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:59.254 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.515 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.515 "name": "Existed_Raid", 00:14:59.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.515 "strip_size_kb": 0, 00:14:59.515 "state": "configuring", 00:14:59.515 "raid_level": "raid1", 00:14:59.515 "superblock": false, 00:14:59.515 "num_base_bdevs": 3, 00:14:59.515 "num_base_bdevs_discovered": 2, 00:14:59.515 "num_base_bdevs_operational": 3, 00:14:59.515 "base_bdevs_list": [ 00:14:59.515 { 00:14:59.515 "name": null, 00:14:59.515 "uuid": "68375251-5623-4801-843a-848a61526846", 00:14:59.515 "is_configured": false, 00:14:59.515 "data_offset": 0, 00:14:59.515 "data_size": 65536 00:14:59.515 }, 00:14:59.515 { 00:14:59.515 "name": "BaseBdev2", 00:14:59.515 "uuid": "62ef8b89-37f6-4917-8f66-682995a7adcc", 00:14:59.515 "is_configured": true, 00:14:59.515 "data_offset": 0, 00:14:59.515 "data_size": 65536 00:14:59.515 }, 00:14:59.515 { 00:14:59.515 "name": "BaseBdev3", 00:14:59.515 "uuid": "d265d554-213d-4f08-b9f8-a3439c3d4f08", 00:14:59.516 "is_configured": true, 00:14:59.516 "data_offset": 0, 00:14:59.516 "data_size": 65536 00:14:59.516 } 00:14:59.516 ] 00:14:59.516 }' 00:14:59.516 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.516 10:10:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.089 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.089 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:00.089 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:00.089 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.089 10:10:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:00.349 10:10:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 68375251-5623-4801-843a-848a61526846 00:15:00.611 [2024-06-10 10:10:22.296225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:00.611 [2024-06-10 10:10:22.296248] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17b20e0 00:15:00.611 [2024-06-10 10:10:22.296252] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:15:00.611 [2024-06-10 10:10:22.296399] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b23c0 00:15:00.611 [2024-06-10 10:10:22.296494] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17b20e0 00:15:00.611 [2024-06-10 10:10:22.296499] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17b20e0 00:15:00.611 [2024-06-10 10:10:22.296616] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:00.611 NewBaseBdev 00:15:00.611 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:00.611 10:10:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:15:00.611 10:10:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:00.611 10:10:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:00.611 10:10:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:00.611 10:10:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:00.611 10:10:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:00.872 10:10:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:00.872 [ 00:15:00.872 { 00:15:00.872 "name": "NewBaseBdev", 00:15:00.872 "aliases": [ 00:15:00.872 "68375251-5623-4801-843a-848a61526846" 00:15:00.872 ], 00:15:00.872 "product_name": "Malloc disk", 00:15:00.872 "block_size": 512, 00:15:00.872 "num_blocks": 65536, 00:15:00.872 "uuid": "68375251-5623-4801-843a-848a61526846", 00:15:00.872 "assigned_rate_limits": { 00:15:00.872 "rw_ios_per_sec": 0, 00:15:00.872 "rw_mbytes_per_sec": 0, 00:15:00.872 "r_mbytes_per_sec": 0, 00:15:00.872 "w_mbytes_per_sec": 0 00:15:00.872 }, 00:15:00.872 "claimed": true, 00:15:00.872 "claim_type": "exclusive_write", 00:15:00.872 "zoned": false, 00:15:00.872 "supported_io_types": { 00:15:00.872 "read": true, 00:15:00.872 "write": true, 00:15:00.872 "unmap": true, 00:15:00.872 "write_zeroes": true, 00:15:00.872 "flush": true, 00:15:00.872 "reset": true, 00:15:00.872 "compare": false, 00:15:00.872 "compare_and_write": false, 00:15:00.872 "abort": true, 00:15:00.872 "nvme_admin": false, 00:15:00.873 "nvme_io": false 00:15:00.873 }, 00:15:00.873 "memory_domains": [ 00:15:00.873 { 00:15:00.873 "dma_device_id": "system", 00:15:00.873 "dma_device_type": 1 00:15:00.873 }, 00:15:00.873 { 00:15:00.873 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.873 "dma_device_type": 2 00:15:00.873 } 00:15:00.873 ], 
00:15:00.873 "driver_specific": {} 00:15:00.873 } 00:15:00.873 ] 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.873 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:01.134 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.134 "name": "Existed_Raid", 00:15:01.134 "uuid": "0b160288-451b-4d9b-91fb-3b7906cee9c6", 00:15:01.134 "strip_size_kb": 0, 00:15:01.134 "state": "online", 00:15:01.134 "raid_level": "raid1", 00:15:01.134 "superblock": false, 00:15:01.134 "num_base_bdevs": 3, 00:15:01.134 "num_base_bdevs_discovered": 3, 00:15:01.134 "num_base_bdevs_operational": 3, 00:15:01.134 "base_bdevs_list": [ 00:15:01.134 { 00:15:01.134 "name": "NewBaseBdev", 00:15:01.134 "uuid": "68375251-5623-4801-843a-848a61526846", 00:15:01.134 "is_configured": true, 00:15:01.134 "data_offset": 0, 00:15:01.134 "data_size": 65536 00:15:01.134 }, 00:15:01.134 { 00:15:01.134 "name": "BaseBdev2", 00:15:01.134 "uuid": "62ef8b89-37f6-4917-8f66-682995a7adcc", 00:15:01.134 "is_configured": true, 00:15:01.134 "data_offset": 0, 00:15:01.134 "data_size": 65536 00:15:01.134 }, 00:15:01.134 { 00:15:01.134 "name": "BaseBdev3", 00:15:01.134 "uuid": "d265d554-213d-4f08-b9f8-a3439c3d4f08", 00:15:01.134 "is_configured": true, 00:15:01.134 "data_offset": 0, 00:15:01.134 "data_size": 65536 00:15:01.134 } 00:15:01.134 ] 00:15:01.134 }' 00:15:01.134 10:10:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.134 10:10:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:01.708 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:01.708 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:01.708 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:01.708 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:01.708 10:10:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:01.708 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:01.708 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:01.708 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:01.970 [2024-06-10 10:10:23.595718] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:01.970 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:01.970 "name": "Existed_Raid", 00:15:01.970 "aliases": [ 00:15:01.970 "0b160288-451b-4d9b-91fb-3b7906cee9c6" 00:15:01.970 ], 00:15:01.970 "product_name": "Raid Volume", 00:15:01.970 "block_size": 512, 00:15:01.970 "num_blocks": 65536, 00:15:01.970 "uuid": "0b160288-451b-4d9b-91fb-3b7906cee9c6", 00:15:01.970 "assigned_rate_limits": { 00:15:01.970 "rw_ios_per_sec": 0, 00:15:01.970 "rw_mbytes_per_sec": 0, 00:15:01.970 "r_mbytes_per_sec": 0, 00:15:01.970 "w_mbytes_per_sec": 0 00:15:01.970 }, 00:15:01.970 "claimed": false, 00:15:01.970 "zoned": false, 00:15:01.970 "supported_io_types": { 00:15:01.970 "read": true, 00:15:01.970 "write": true, 00:15:01.970 "unmap": false, 00:15:01.970 "write_zeroes": true, 00:15:01.970 "flush": false, 00:15:01.970 "reset": true, 00:15:01.970 "compare": false, 00:15:01.970 "compare_and_write": false, 00:15:01.970 "abort": false, 00:15:01.970 "nvme_admin": false, 00:15:01.970 "nvme_io": false 00:15:01.970 }, 00:15:01.970 "memory_domains": [ 00:15:01.970 { 00:15:01.970 "dma_device_id": "system", 00:15:01.970 "dma_device_type": 1 00:15:01.970 }, 00:15:01.970 { 00:15:01.970 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.970 "dma_device_type": 2 00:15:01.970 }, 00:15:01.970 { 00:15:01.970 "dma_device_id": "system", 00:15:01.970 "dma_device_type": 1 00:15:01.970 }, 00:15:01.970 { 00:15:01.970 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.970 "dma_device_type": 2 00:15:01.970 }, 00:15:01.970 { 00:15:01.970 "dma_device_id": "system", 00:15:01.970 "dma_device_type": 1 00:15:01.970 }, 00:15:01.970 { 00:15:01.970 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.970 "dma_device_type": 2 00:15:01.970 } 00:15:01.970 ], 00:15:01.970 "driver_specific": { 00:15:01.970 "raid": { 00:15:01.970 "uuid": "0b160288-451b-4d9b-91fb-3b7906cee9c6", 00:15:01.970 "strip_size_kb": 0, 00:15:01.970 "state": "online", 00:15:01.970 "raid_level": "raid1", 00:15:01.970 "superblock": false, 00:15:01.970 "num_base_bdevs": 3, 00:15:01.970 "num_base_bdevs_discovered": 3, 00:15:01.970 "num_base_bdevs_operational": 3, 00:15:01.970 "base_bdevs_list": [ 00:15:01.970 { 00:15:01.970 "name": "NewBaseBdev", 00:15:01.970 "uuid": "68375251-5623-4801-843a-848a61526846", 00:15:01.970 "is_configured": true, 00:15:01.970 "data_offset": 0, 00:15:01.970 "data_size": 65536 00:15:01.970 }, 00:15:01.970 { 00:15:01.970 "name": "BaseBdev2", 00:15:01.970 "uuid": "62ef8b89-37f6-4917-8f66-682995a7adcc", 00:15:01.970 "is_configured": true, 00:15:01.970 "data_offset": 0, 00:15:01.970 "data_size": 65536 00:15:01.970 }, 00:15:01.970 { 00:15:01.970 "name": "BaseBdev3", 00:15:01.970 "uuid": "d265d554-213d-4f08-b9f8-a3439c3d4f08", 00:15:01.970 "is_configured": true, 00:15:01.970 "data_offset": 0, 00:15:01.970 "data_size": 65536 00:15:01.970 } 00:15:01.970 ] 00:15:01.970 } 00:15:01.970 } 00:15:01.970 }' 00:15:01.970 
10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:01.970 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:01.970 BaseBdev2 00:15:01.970 BaseBdev3' 00:15:01.970 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:01.970 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:01.970 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:02.232 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:02.232 "name": "NewBaseBdev", 00:15:02.232 "aliases": [ 00:15:02.232 "68375251-5623-4801-843a-848a61526846" 00:15:02.232 ], 00:15:02.232 "product_name": "Malloc disk", 00:15:02.232 "block_size": 512, 00:15:02.232 "num_blocks": 65536, 00:15:02.232 "uuid": "68375251-5623-4801-843a-848a61526846", 00:15:02.232 "assigned_rate_limits": { 00:15:02.232 "rw_ios_per_sec": 0, 00:15:02.232 "rw_mbytes_per_sec": 0, 00:15:02.232 "r_mbytes_per_sec": 0, 00:15:02.232 "w_mbytes_per_sec": 0 00:15:02.232 }, 00:15:02.232 "claimed": true, 00:15:02.232 "claim_type": "exclusive_write", 00:15:02.232 "zoned": false, 00:15:02.232 "supported_io_types": { 00:15:02.232 "read": true, 00:15:02.232 "write": true, 00:15:02.232 "unmap": true, 00:15:02.232 "write_zeroes": true, 00:15:02.232 "flush": true, 00:15:02.232 "reset": true, 00:15:02.232 "compare": false, 00:15:02.232 "compare_and_write": false, 00:15:02.232 "abort": true, 00:15:02.232 "nvme_admin": false, 00:15:02.232 "nvme_io": false 00:15:02.232 }, 00:15:02.232 "memory_domains": [ 00:15:02.232 { 00:15:02.232 "dma_device_id": "system", 00:15:02.232 "dma_device_type": 1 00:15:02.232 }, 00:15:02.232 { 00:15:02.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.232 "dma_device_type": 2 00:15:02.232 } 00:15:02.232 ], 00:15:02.232 "driver_specific": {} 00:15:02.232 }' 00:15:02.232 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.232 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.232 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:02.232 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.232 10:10:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.232 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:02.232 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.232 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.232 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:02.232 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.493 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.493 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:02.493 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:02.493 10:10:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:02.493 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:02.493 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:02.493 "name": "BaseBdev2", 00:15:02.493 "aliases": [ 00:15:02.493 "62ef8b89-37f6-4917-8f66-682995a7adcc" 00:15:02.493 ], 00:15:02.493 "product_name": "Malloc disk", 00:15:02.493 "block_size": 512, 00:15:02.493 "num_blocks": 65536, 00:15:02.493 "uuid": "62ef8b89-37f6-4917-8f66-682995a7adcc", 00:15:02.493 "assigned_rate_limits": { 00:15:02.493 "rw_ios_per_sec": 0, 00:15:02.493 "rw_mbytes_per_sec": 0, 00:15:02.493 "r_mbytes_per_sec": 0, 00:15:02.493 "w_mbytes_per_sec": 0 00:15:02.493 }, 00:15:02.493 "claimed": true, 00:15:02.493 "claim_type": "exclusive_write", 00:15:02.493 "zoned": false, 00:15:02.493 "supported_io_types": { 00:15:02.493 "read": true, 00:15:02.493 "write": true, 00:15:02.493 "unmap": true, 00:15:02.493 "write_zeroes": true, 00:15:02.493 "flush": true, 00:15:02.493 "reset": true, 00:15:02.493 "compare": false, 00:15:02.493 "compare_and_write": false, 00:15:02.493 "abort": true, 00:15:02.493 "nvme_admin": false, 00:15:02.493 "nvme_io": false 00:15:02.493 }, 00:15:02.493 "memory_domains": [ 00:15:02.493 { 00:15:02.493 "dma_device_id": "system", 00:15:02.493 "dma_device_type": 1 00:15:02.493 }, 00:15:02.493 { 00:15:02.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.493 "dma_device_type": 2 00:15:02.493 } 00:15:02.493 ], 00:15:02.493 "driver_specific": {} 00:15:02.493 }' 00:15:02.493 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.753 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.753 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:02.753 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.753 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.753 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:02.753 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.753 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.014 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:03.014 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.014 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.015 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:03.015 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:03.015 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:03.015 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:03.275 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:03.275 "name": "BaseBdev3", 00:15:03.275 "aliases": [ 00:15:03.275 
"d265d554-213d-4f08-b9f8-a3439c3d4f08" 00:15:03.275 ], 00:15:03.275 "product_name": "Malloc disk", 00:15:03.275 "block_size": 512, 00:15:03.275 "num_blocks": 65536, 00:15:03.275 "uuid": "d265d554-213d-4f08-b9f8-a3439c3d4f08", 00:15:03.275 "assigned_rate_limits": { 00:15:03.275 "rw_ios_per_sec": 0, 00:15:03.275 "rw_mbytes_per_sec": 0, 00:15:03.275 "r_mbytes_per_sec": 0, 00:15:03.275 "w_mbytes_per_sec": 0 00:15:03.275 }, 00:15:03.275 "claimed": true, 00:15:03.275 "claim_type": "exclusive_write", 00:15:03.275 "zoned": false, 00:15:03.275 "supported_io_types": { 00:15:03.275 "read": true, 00:15:03.275 "write": true, 00:15:03.275 "unmap": true, 00:15:03.275 "write_zeroes": true, 00:15:03.275 "flush": true, 00:15:03.275 "reset": true, 00:15:03.275 "compare": false, 00:15:03.276 "compare_and_write": false, 00:15:03.276 "abort": true, 00:15:03.276 "nvme_admin": false, 00:15:03.276 "nvme_io": false 00:15:03.276 }, 00:15:03.276 "memory_domains": [ 00:15:03.276 { 00:15:03.276 "dma_device_id": "system", 00:15:03.276 "dma_device_type": 1 00:15:03.276 }, 00:15:03.276 { 00:15:03.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.276 "dma_device_type": 2 00:15:03.276 } 00:15:03.276 ], 00:15:03.276 "driver_specific": {} 00:15:03.276 }' 00:15:03.276 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.276 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.276 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:03.276 10:10:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.276 10:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.276 10:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:03.276 10:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.276 10:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.537 10:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:03.537 10:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.537 10:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.537 10:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:03.537 10:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:03.798 [2024-06-10 10:10:25.424144] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:03.799 [2024-06-10 10:10:25.424161] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:03.799 [2024-06-10 10:10:25.424197] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:03.799 [2024-06-10 10:10:25.424397] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:03.799 [2024-06-10 10:10:25.424403] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17b20e0 name Existed_Raid, state offline 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1006445 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 
-- # '[' -z 1006445 ']' 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1006445 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1006445 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1006445' 00:15:03.799 killing process with pid 1006445 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1006445 00:15:03.799 [2024-06-10 10:10:25.498550] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1006445 00:15:03.799 [2024-06-10 10:10:25.513288] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:03.799 00:15:03.799 real 0m23.623s 00:15:03.799 user 0m44.304s 00:15:03.799 sys 0m3.459s 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:03.799 10:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:03.799 ************************************ 00:15:03.799 END TEST raid_state_function_test 00:15:03.799 ************************************ 00:15:04.060 10:10:25 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:15:04.060 10:10:25 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:15:04.060 10:10:25 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:04.060 10:10:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:04.061 ************************************ 00:15:04.061 START TEST raid_state_function_test_sb 00:15:04.061 ************************************ 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 3 true 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:04.061 
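The killprocess sequence traced just above (common/autotest_common.sh@949-@973) is what tears down the first test's target before the _sb variant starts. Reconstructed from those xtrace lines only, and with the pid hard-coded purely for illustration, it amounts to:

    pid=1006445                              # raid_pid of the target used by raid_state_function_test
    kill -0 "$pid"                           # fail fast if the process is already gone
    if [ "$(uname)" = Linux ]; then
        process_name=$(ps --no-headers -o comm= "$pid")   # "reactor_0" in the trace; a sudo wrapper is handled differently
    fi
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid"                              # reap it so the next test starts from a clean slate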
10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1011027 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1011027' 00:15:04.061 Process raid pid: 1011027 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1011027 /var/tmp/spdk-raid.sock 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1011027 ']' 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:04.061 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:15:04.061 10:10:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:04.061 [2024-06-10 10:10:25.771247] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
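raid_state_function_test_sb then brings up its own SPDK target on the same UNIX socket: the bdev_svc app is launched with bdev_raid debug logging (@243) and the test blocks in waitforlisten (@246) until the RPC socket answers. A sketch of that startup, assuming the launch is backgrounded and the pid captured by the shell (the trace only shows the resulting raid_pid=1011027):

    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
        -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!
    waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock   # autotest helper seen at @246: waits for the socket to accept RPCs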
00:15:04.061 [2024-06-10 10:10:25.771292] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:04.061 [2024-06-10 10:10:25.857511] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:04.061 [2024-06-10 10:10:25.920814] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:15:04.322 [2024-06-10 10:10:25.961479] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:04.322 [2024-06-10 10:10:25.961501] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:04.895 10:10:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:15:04.895 10:10:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:15:04.895 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:05.167 [2024-06-10 10:10:26.780498] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:05.167 [2024-06-10 10:10:26.780528] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:05.167 [2024-06-10 10:10:26.780534] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:05.167 [2024-06-10 10:10:26.780540] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:05.167 [2024-06-10 10:10:26.780544] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:05.167 [2024-06-10 10:10:26.780550] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:05.167 "name": "Existed_Raid", 00:15:05.167 "uuid": "58b8c04b-a5de-4c23-a285-1f3d5dff3b31", 00:15:05.167 "strip_size_kb": 0, 00:15:05.167 "state": "configuring", 00:15:05.167 "raid_level": "raid1", 00:15:05.167 "superblock": true, 00:15:05.167 "num_base_bdevs": 3, 00:15:05.167 "num_base_bdevs_discovered": 0, 00:15:05.167 "num_base_bdevs_operational": 3, 00:15:05.167 "base_bdevs_list": [ 00:15:05.167 { 00:15:05.167 "name": "BaseBdev1", 00:15:05.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:05.167 "is_configured": false, 00:15:05.167 "data_offset": 0, 00:15:05.167 "data_size": 0 00:15:05.167 }, 00:15:05.167 { 00:15:05.167 "name": "BaseBdev2", 00:15:05.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:05.167 "is_configured": false, 00:15:05.167 "data_offset": 0, 00:15:05.167 "data_size": 0 00:15:05.167 }, 00:15:05.167 { 00:15:05.167 "name": "BaseBdev3", 00:15:05.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:05.167 "is_configured": false, 00:15:05.167 "data_offset": 0, 00:15:05.167 "data_size": 0 00:15:05.167 } 00:15:05.167 ] 00:15:05.167 }' 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:05.167 10:10:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:05.740 10:10:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:06.001 [2024-06-10 10:10:27.670634] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:06.001 [2024-06-10 10:10:27.670649] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x164fb00 name Existed_Raid, state configuring 00:15:06.001 10:10:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:06.001 [2024-06-10 10:10:27.859134] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:06.001 [2024-06-10 10:10:27.859150] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:06.001 [2024-06-10 10:10:27.859155] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:06.001 [2024-06-10 10:10:27.859161] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:06.002 [2024-06-10 10:10:27.859165] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:06.002 [2024-06-10 10:10:27.859171] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:06.262 10:10:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:06.262 [2024-06-10 10:10:28.042228] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:06.262 BaseBdev1 00:15:06.262 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:06.262 10:10:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:15:06.262 10:10:28 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:06.262 10:10:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:06.262 10:10:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:06.262 10:10:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:06.262 10:10:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:06.524 10:10:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:06.786 [ 00:15:06.786 { 00:15:06.786 "name": "BaseBdev1", 00:15:06.786 "aliases": [ 00:15:06.786 "1ccf7ec9-91a4-4f1c-b30f-1a28d743babe" 00:15:06.786 ], 00:15:06.786 "product_name": "Malloc disk", 00:15:06.786 "block_size": 512, 00:15:06.786 "num_blocks": 65536, 00:15:06.786 "uuid": "1ccf7ec9-91a4-4f1c-b30f-1a28d743babe", 00:15:06.786 "assigned_rate_limits": { 00:15:06.786 "rw_ios_per_sec": 0, 00:15:06.786 "rw_mbytes_per_sec": 0, 00:15:06.786 "r_mbytes_per_sec": 0, 00:15:06.786 "w_mbytes_per_sec": 0 00:15:06.786 }, 00:15:06.786 "claimed": true, 00:15:06.786 "claim_type": "exclusive_write", 00:15:06.786 "zoned": false, 00:15:06.786 "supported_io_types": { 00:15:06.786 "read": true, 00:15:06.786 "write": true, 00:15:06.786 "unmap": true, 00:15:06.786 "write_zeroes": true, 00:15:06.786 "flush": true, 00:15:06.786 "reset": true, 00:15:06.786 "compare": false, 00:15:06.786 "compare_and_write": false, 00:15:06.786 "abort": true, 00:15:06.786 "nvme_admin": false, 00:15:06.786 "nvme_io": false 00:15:06.786 }, 00:15:06.786 "memory_domains": [ 00:15:06.786 { 00:15:06.786 "dma_device_id": "system", 00:15:06.786 "dma_device_type": 1 00:15:06.786 }, 00:15:06.786 { 00:15:06.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.786 "dma_device_type": 2 00:15:06.786 } 00:15:06.786 ], 00:15:06.786 "driver_specific": {} 00:15:06.786 } 00:15:06.786 ] 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:06.786 "name": "Existed_Raid", 00:15:06.786 "uuid": "bd9766a3-0a4d-4d59-bd32-d519fd698223", 00:15:06.786 "strip_size_kb": 0, 00:15:06.786 "state": "configuring", 00:15:06.786 "raid_level": "raid1", 00:15:06.786 "superblock": true, 00:15:06.786 "num_base_bdevs": 3, 00:15:06.786 "num_base_bdevs_discovered": 1, 00:15:06.786 "num_base_bdevs_operational": 3, 00:15:06.786 "base_bdevs_list": [ 00:15:06.786 { 00:15:06.786 "name": "BaseBdev1", 00:15:06.786 "uuid": "1ccf7ec9-91a4-4f1c-b30f-1a28d743babe", 00:15:06.786 "is_configured": true, 00:15:06.786 "data_offset": 2048, 00:15:06.786 "data_size": 63488 00:15:06.786 }, 00:15:06.786 { 00:15:06.786 "name": "BaseBdev2", 00:15:06.786 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.786 "is_configured": false, 00:15:06.786 "data_offset": 0, 00:15:06.786 "data_size": 0 00:15:06.786 }, 00:15:06.786 { 00:15:06.786 "name": "BaseBdev3", 00:15:06.786 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.786 "is_configured": false, 00:15:06.786 "data_offset": 0, 00:15:06.786 "data_size": 0 00:15:06.786 } 00:15:06.786 ] 00:15:06.786 }' 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:06.786 10:10:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:07.358 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:07.619 [2024-06-10 10:10:29.305417] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:07.619 [2024-06-10 10:10:29.305444] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x164f3f0 name Existed_Raid, state configuring 00:15:07.619 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:07.881 [2024-06-10 10:10:29.497930] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:07.881 [2024-06-10 10:10:29.499051] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:07.881 [2024-06-10 10:10:29.499074] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:07.881 [2024-06-10 10:10:29.499079] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:07.881 [2024-06-10 10:10:29.499085] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.881 "name": "Existed_Raid", 00:15:07.881 "uuid": "4f5a0461-32c6-4804-9985-51d9ec97976b", 00:15:07.881 "strip_size_kb": 0, 00:15:07.881 "state": "configuring", 00:15:07.881 "raid_level": "raid1", 00:15:07.881 "superblock": true, 00:15:07.881 "num_base_bdevs": 3, 00:15:07.881 "num_base_bdevs_discovered": 1, 00:15:07.881 "num_base_bdevs_operational": 3, 00:15:07.881 "base_bdevs_list": [ 00:15:07.881 { 00:15:07.881 "name": "BaseBdev1", 00:15:07.881 "uuid": "1ccf7ec9-91a4-4f1c-b30f-1a28d743babe", 00:15:07.881 "is_configured": true, 00:15:07.881 "data_offset": 2048, 00:15:07.881 "data_size": 63488 00:15:07.881 }, 00:15:07.881 { 00:15:07.881 "name": "BaseBdev2", 00:15:07.881 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.881 "is_configured": false, 00:15:07.881 "data_offset": 0, 00:15:07.881 "data_size": 0 00:15:07.881 }, 00:15:07.881 { 00:15:07.881 "name": "BaseBdev3", 00:15:07.881 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.881 "is_configured": false, 00:15:07.881 "data_offset": 0, 00:15:07.881 "data_size": 0 00:15:07.881 } 00:15:07.881 ] 00:15:07.881 }' 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.881 10:10:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:08.452 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:08.713 [2024-06-10 10:10:30.445237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:08.713 BaseBdev2 00:15:08.713 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:08.713 10:10:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:15:08.713 10:10:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:08.713 10:10:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:08.713 10:10:30 
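Because Existed_Raid is created with -s (superblock) before any of its base bdevs exist, the create calls above (@250/@256/@264) leave it in the configuring state, and the raid then claims each BaseBdev as the matching malloc bdev appears: the @126 dumps above show num_base_bdevs_discovered going from 0 to 1 once BaseBdev1 is created. The assembly pattern, with the same names and socket as the test:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # the raid is registered first; with no base bdevs present it stays in "configuring"
    $rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
    # each malloc bdev is claimed by the raid as soon as it is created
    $rpc bdev_malloc_create 32 512 -b BaseBdev1
    $rpc bdev_malloc_create 32 512 -b BaseBdev2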
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:08.713 10:10:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:08.713 10:10:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:08.973 [ 00:15:08.973 { 00:15:08.973 "name": "BaseBdev2", 00:15:08.973 "aliases": [ 00:15:08.973 "8f81c89b-dcf9-49c7-88f9-f25b957a0bee" 00:15:08.973 ], 00:15:08.973 "product_name": "Malloc disk", 00:15:08.973 "block_size": 512, 00:15:08.973 "num_blocks": 65536, 00:15:08.973 "uuid": "8f81c89b-dcf9-49c7-88f9-f25b957a0bee", 00:15:08.973 "assigned_rate_limits": { 00:15:08.973 "rw_ios_per_sec": 0, 00:15:08.973 "rw_mbytes_per_sec": 0, 00:15:08.973 "r_mbytes_per_sec": 0, 00:15:08.973 "w_mbytes_per_sec": 0 00:15:08.973 }, 00:15:08.973 "claimed": true, 00:15:08.973 "claim_type": "exclusive_write", 00:15:08.973 "zoned": false, 00:15:08.973 "supported_io_types": { 00:15:08.973 "read": true, 00:15:08.973 "write": true, 00:15:08.973 "unmap": true, 00:15:08.973 "write_zeroes": true, 00:15:08.973 "flush": true, 00:15:08.973 "reset": true, 00:15:08.973 "compare": false, 00:15:08.973 "compare_and_write": false, 00:15:08.973 "abort": true, 00:15:08.973 "nvme_admin": false, 00:15:08.973 "nvme_io": false 00:15:08.973 }, 00:15:08.973 "memory_domains": [ 00:15:08.973 { 00:15:08.973 "dma_device_id": "system", 00:15:08.973 "dma_device_type": 1 00:15:08.973 }, 00:15:08.973 { 00:15:08.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.973 "dma_device_type": 2 00:15:08.973 } 00:15:08.973 ], 00:15:08.973 "driver_specific": {} 00:15:08.973 } 00:15:08.973 ] 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:08.973 10:10:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.973 10:10:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:09.234 10:10:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:09.234 "name": "Existed_Raid", 00:15:09.234 "uuid": "4f5a0461-32c6-4804-9985-51d9ec97976b", 00:15:09.234 "strip_size_kb": 0, 00:15:09.234 "state": "configuring", 00:15:09.234 "raid_level": "raid1", 00:15:09.234 "superblock": true, 00:15:09.234 "num_base_bdevs": 3, 00:15:09.234 "num_base_bdevs_discovered": 2, 00:15:09.234 "num_base_bdevs_operational": 3, 00:15:09.234 "base_bdevs_list": [ 00:15:09.234 { 00:15:09.234 "name": "BaseBdev1", 00:15:09.234 "uuid": "1ccf7ec9-91a4-4f1c-b30f-1a28d743babe", 00:15:09.234 "is_configured": true, 00:15:09.234 "data_offset": 2048, 00:15:09.234 "data_size": 63488 00:15:09.234 }, 00:15:09.234 { 00:15:09.234 "name": "BaseBdev2", 00:15:09.234 "uuid": "8f81c89b-dcf9-49c7-88f9-f25b957a0bee", 00:15:09.234 "is_configured": true, 00:15:09.234 "data_offset": 2048, 00:15:09.234 "data_size": 63488 00:15:09.234 }, 00:15:09.234 { 00:15:09.234 "name": "BaseBdev3", 00:15:09.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.234 "is_configured": false, 00:15:09.234 "data_offset": 0, 00:15:09.234 "data_size": 0 00:15:09.234 } 00:15:09.234 ] 00:15:09.234 }' 00:15:09.234 10:10:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:09.234 10:10:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:09.805 10:10:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:10.066 [2024-06-10 10:10:31.733455] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:10.066 [2024-06-10 10:10:31.733568] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16502c0 00:15:10.066 [2024-06-10 10:10:31.733576] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:10.066 [2024-06-10 10:10:31.733708] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f4850 00:15:10.066 [2024-06-10 10:10:31.733798] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16502c0 00:15:10.066 [2024-06-10 10:10:31.733804] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16502c0 00:15:10.066 [2024-06-10 10:10:31.733880] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:10.066 BaseBdev3 00:15:10.066 10:10:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:10.066 10:10:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:15:10.066 10:10:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:10.066 10:10:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:10.066 10:10:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:10.066 10:10:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 
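The waitforbdev call whose locals were just traced (bdev_timeout defaulting to 2000) is followed in the trace by two RPCs: flush any pending examine work, then look the bdev up by name with that timeout. In isolation, a sketch of the same pair against the test's socket:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $rpc bdev_wait_for_examine                   # let claim/examine callbacks finish first
    $rpc bdev_get_bdevs -b BaseBdev3 -t 2000     # wait up to bdev_timeout (2000) for BaseBdev3 to appear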
00:15:10.066 10:10:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:10.066 10:10:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:10.326 [ 00:15:10.326 { 00:15:10.326 "name": "BaseBdev3", 00:15:10.326 "aliases": [ 00:15:10.326 "190142d7-edb0-4df3-8f76-3728f2703e2b" 00:15:10.326 ], 00:15:10.326 "product_name": "Malloc disk", 00:15:10.326 "block_size": 512, 00:15:10.326 "num_blocks": 65536, 00:15:10.326 "uuid": "190142d7-edb0-4df3-8f76-3728f2703e2b", 00:15:10.326 "assigned_rate_limits": { 00:15:10.326 "rw_ios_per_sec": 0, 00:15:10.326 "rw_mbytes_per_sec": 0, 00:15:10.326 "r_mbytes_per_sec": 0, 00:15:10.326 "w_mbytes_per_sec": 0 00:15:10.326 }, 00:15:10.326 "claimed": true, 00:15:10.326 "claim_type": "exclusive_write", 00:15:10.326 "zoned": false, 00:15:10.326 "supported_io_types": { 00:15:10.326 "read": true, 00:15:10.326 "write": true, 00:15:10.326 "unmap": true, 00:15:10.326 "write_zeroes": true, 00:15:10.326 "flush": true, 00:15:10.326 "reset": true, 00:15:10.326 "compare": false, 00:15:10.326 "compare_and_write": false, 00:15:10.326 "abort": true, 00:15:10.326 "nvme_admin": false, 00:15:10.326 "nvme_io": false 00:15:10.327 }, 00:15:10.327 "memory_domains": [ 00:15:10.327 { 00:15:10.327 "dma_device_id": "system", 00:15:10.327 "dma_device_type": 1 00:15:10.327 }, 00:15:10.327 { 00:15:10.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.327 "dma_device_type": 2 00:15:10.327 } 00:15:10.327 ], 00:15:10.327 "driver_specific": {} 00:15:10.327 } 00:15:10.327 ] 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.327 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.327 10:10:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.587 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.587 "name": "Existed_Raid", 00:15:10.587 "uuid": "4f5a0461-32c6-4804-9985-51d9ec97976b", 00:15:10.587 "strip_size_kb": 0, 00:15:10.587 "state": "online", 00:15:10.587 "raid_level": "raid1", 00:15:10.587 "superblock": true, 00:15:10.587 "num_base_bdevs": 3, 00:15:10.587 "num_base_bdevs_discovered": 3, 00:15:10.587 "num_base_bdevs_operational": 3, 00:15:10.587 "base_bdevs_list": [ 00:15:10.587 { 00:15:10.587 "name": "BaseBdev1", 00:15:10.587 "uuid": "1ccf7ec9-91a4-4f1c-b30f-1a28d743babe", 00:15:10.587 "is_configured": true, 00:15:10.587 "data_offset": 2048, 00:15:10.587 "data_size": 63488 00:15:10.587 }, 00:15:10.587 { 00:15:10.587 "name": "BaseBdev2", 00:15:10.587 "uuid": "8f81c89b-dcf9-49c7-88f9-f25b957a0bee", 00:15:10.587 "is_configured": true, 00:15:10.587 "data_offset": 2048, 00:15:10.587 "data_size": 63488 00:15:10.587 }, 00:15:10.587 { 00:15:10.587 "name": "BaseBdev3", 00:15:10.587 "uuid": "190142d7-edb0-4df3-8f76-3728f2703e2b", 00:15:10.587 "is_configured": true, 00:15:10.587 "data_offset": 2048, 00:15:10.587 "data_size": 63488 00:15:10.587 } 00:15:10.587 ] 00:15:10.587 }' 00:15:10.587 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.587 10:10:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:11.157 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:11.157 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:11.157 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:11.157 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:11.157 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:11.157 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:11.157 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:11.157 10:10:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:11.157 [2024-06-10 10:10:33.016921] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:11.418 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:11.418 "name": "Existed_Raid", 00:15:11.418 "aliases": [ 00:15:11.418 "4f5a0461-32c6-4804-9985-51d9ec97976b" 00:15:11.418 ], 00:15:11.418 "product_name": "Raid Volume", 00:15:11.418 "block_size": 512, 00:15:11.418 "num_blocks": 63488, 00:15:11.418 "uuid": "4f5a0461-32c6-4804-9985-51d9ec97976b", 00:15:11.418 "assigned_rate_limits": { 00:15:11.418 "rw_ios_per_sec": 0, 00:15:11.418 "rw_mbytes_per_sec": 0, 00:15:11.418 "r_mbytes_per_sec": 0, 00:15:11.418 "w_mbytes_per_sec": 0 00:15:11.418 }, 00:15:11.418 "claimed": false, 00:15:11.418 "zoned": false, 00:15:11.418 "supported_io_types": { 00:15:11.418 "read": true, 00:15:11.418 "write": true, 00:15:11.418 "unmap": false, 00:15:11.418 "write_zeroes": true, 00:15:11.418 "flush": false, 00:15:11.418 "reset": true, 00:15:11.418 
"compare": false, 00:15:11.418 "compare_and_write": false, 00:15:11.418 "abort": false, 00:15:11.418 "nvme_admin": false, 00:15:11.418 "nvme_io": false 00:15:11.418 }, 00:15:11.418 "memory_domains": [ 00:15:11.418 { 00:15:11.418 "dma_device_id": "system", 00:15:11.418 "dma_device_type": 1 00:15:11.418 }, 00:15:11.418 { 00:15:11.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.418 "dma_device_type": 2 00:15:11.418 }, 00:15:11.418 { 00:15:11.418 "dma_device_id": "system", 00:15:11.418 "dma_device_type": 1 00:15:11.418 }, 00:15:11.418 { 00:15:11.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.418 "dma_device_type": 2 00:15:11.418 }, 00:15:11.418 { 00:15:11.418 "dma_device_id": "system", 00:15:11.418 "dma_device_type": 1 00:15:11.418 }, 00:15:11.418 { 00:15:11.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.418 "dma_device_type": 2 00:15:11.418 } 00:15:11.418 ], 00:15:11.418 "driver_specific": { 00:15:11.418 "raid": { 00:15:11.418 "uuid": "4f5a0461-32c6-4804-9985-51d9ec97976b", 00:15:11.418 "strip_size_kb": 0, 00:15:11.418 "state": "online", 00:15:11.418 "raid_level": "raid1", 00:15:11.418 "superblock": true, 00:15:11.418 "num_base_bdevs": 3, 00:15:11.418 "num_base_bdevs_discovered": 3, 00:15:11.418 "num_base_bdevs_operational": 3, 00:15:11.418 "base_bdevs_list": [ 00:15:11.418 { 00:15:11.418 "name": "BaseBdev1", 00:15:11.418 "uuid": "1ccf7ec9-91a4-4f1c-b30f-1a28d743babe", 00:15:11.418 "is_configured": true, 00:15:11.418 "data_offset": 2048, 00:15:11.418 "data_size": 63488 00:15:11.418 }, 00:15:11.418 { 00:15:11.418 "name": "BaseBdev2", 00:15:11.418 "uuid": "8f81c89b-dcf9-49c7-88f9-f25b957a0bee", 00:15:11.418 "is_configured": true, 00:15:11.418 "data_offset": 2048, 00:15:11.418 "data_size": 63488 00:15:11.418 }, 00:15:11.418 { 00:15:11.418 "name": "BaseBdev3", 00:15:11.418 "uuid": "190142d7-edb0-4df3-8f76-3728f2703e2b", 00:15:11.418 "is_configured": true, 00:15:11.418 "data_offset": 2048, 00:15:11.418 "data_size": 63488 00:15:11.418 } 00:15:11.418 ] 00:15:11.418 } 00:15:11.418 } 00:15:11.418 }' 00:15:11.418 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:11.418 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:11.418 BaseBdev2 00:15:11.418 BaseBdev3' 00:15:11.418 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:11.418 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:11.418 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:11.418 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:11.418 "name": "BaseBdev1", 00:15:11.418 "aliases": [ 00:15:11.418 "1ccf7ec9-91a4-4f1c-b30f-1a28d743babe" 00:15:11.418 ], 00:15:11.418 "product_name": "Malloc disk", 00:15:11.418 "block_size": 512, 00:15:11.418 "num_blocks": 65536, 00:15:11.418 "uuid": "1ccf7ec9-91a4-4f1c-b30f-1a28d743babe", 00:15:11.418 "assigned_rate_limits": { 00:15:11.418 "rw_ios_per_sec": 0, 00:15:11.418 "rw_mbytes_per_sec": 0, 00:15:11.418 "r_mbytes_per_sec": 0, 00:15:11.418 "w_mbytes_per_sec": 0 00:15:11.418 }, 00:15:11.418 "claimed": true, 00:15:11.418 "claim_type": "exclusive_write", 00:15:11.418 "zoned": false, 00:15:11.418 
"supported_io_types": { 00:15:11.418 "read": true, 00:15:11.418 "write": true, 00:15:11.418 "unmap": true, 00:15:11.418 "write_zeroes": true, 00:15:11.418 "flush": true, 00:15:11.418 "reset": true, 00:15:11.418 "compare": false, 00:15:11.418 "compare_and_write": false, 00:15:11.418 "abort": true, 00:15:11.418 "nvme_admin": false, 00:15:11.418 "nvme_io": false 00:15:11.418 }, 00:15:11.418 "memory_domains": [ 00:15:11.418 { 00:15:11.418 "dma_device_id": "system", 00:15:11.418 "dma_device_type": 1 00:15:11.418 }, 00:15:11.418 { 00:15:11.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.418 "dma_device_type": 2 00:15:11.418 } 00:15:11.418 ], 00:15:11.418 "driver_specific": {} 00:15:11.418 }' 00:15:11.418 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:11.679 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:11.679 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:11.679 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.679 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.679 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:11.679 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:11.679 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:11.679 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:11.679 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:11.939 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:11.939 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:11.939 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:11.939 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:11.939 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:12.199 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:12.199 "name": "BaseBdev2", 00:15:12.199 "aliases": [ 00:15:12.199 "8f81c89b-dcf9-49c7-88f9-f25b957a0bee" 00:15:12.199 ], 00:15:12.199 "product_name": "Malloc disk", 00:15:12.199 "block_size": 512, 00:15:12.199 "num_blocks": 65536, 00:15:12.199 "uuid": "8f81c89b-dcf9-49c7-88f9-f25b957a0bee", 00:15:12.199 "assigned_rate_limits": { 00:15:12.199 "rw_ios_per_sec": 0, 00:15:12.199 "rw_mbytes_per_sec": 0, 00:15:12.199 "r_mbytes_per_sec": 0, 00:15:12.199 "w_mbytes_per_sec": 0 00:15:12.199 }, 00:15:12.199 "claimed": true, 00:15:12.199 "claim_type": "exclusive_write", 00:15:12.199 "zoned": false, 00:15:12.199 "supported_io_types": { 00:15:12.199 "read": true, 00:15:12.199 "write": true, 00:15:12.199 "unmap": true, 00:15:12.199 "write_zeroes": true, 00:15:12.199 "flush": true, 00:15:12.199 "reset": true, 00:15:12.199 "compare": false, 00:15:12.199 "compare_and_write": false, 00:15:12.199 "abort": true, 00:15:12.199 "nvme_admin": false, 00:15:12.199 "nvme_io": false 00:15:12.199 }, 00:15:12.199 "memory_domains": [ 00:15:12.199 { 
00:15:12.199 "dma_device_id": "system", 00:15:12.199 "dma_device_type": 1 00:15:12.199 }, 00:15:12.199 { 00:15:12.199 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.199 "dma_device_type": 2 00:15:12.199 } 00:15:12.199 ], 00:15:12.199 "driver_specific": {} 00:15:12.199 }' 00:15:12.199 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.199 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.199 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:12.199 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.199 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.199 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:12.199 10:10:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.199 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.199 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:12.200 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.460 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.460 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:12.460 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:12.460 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:12.460 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:12.720 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:12.720 "name": "BaseBdev3", 00:15:12.720 "aliases": [ 00:15:12.720 "190142d7-edb0-4df3-8f76-3728f2703e2b" 00:15:12.720 ], 00:15:12.720 "product_name": "Malloc disk", 00:15:12.720 "block_size": 512, 00:15:12.720 "num_blocks": 65536, 00:15:12.720 "uuid": "190142d7-edb0-4df3-8f76-3728f2703e2b", 00:15:12.720 "assigned_rate_limits": { 00:15:12.720 "rw_ios_per_sec": 0, 00:15:12.720 "rw_mbytes_per_sec": 0, 00:15:12.720 "r_mbytes_per_sec": 0, 00:15:12.720 "w_mbytes_per_sec": 0 00:15:12.720 }, 00:15:12.720 "claimed": true, 00:15:12.720 "claim_type": "exclusive_write", 00:15:12.720 "zoned": false, 00:15:12.720 "supported_io_types": { 00:15:12.720 "read": true, 00:15:12.720 "write": true, 00:15:12.720 "unmap": true, 00:15:12.720 "write_zeroes": true, 00:15:12.720 "flush": true, 00:15:12.720 "reset": true, 00:15:12.720 "compare": false, 00:15:12.720 "compare_and_write": false, 00:15:12.720 "abort": true, 00:15:12.720 "nvme_admin": false, 00:15:12.720 "nvme_io": false 00:15:12.720 }, 00:15:12.720 "memory_domains": [ 00:15:12.720 { 00:15:12.720 "dma_device_id": "system", 00:15:12.720 "dma_device_type": 1 00:15:12.720 }, 00:15:12.720 { 00:15:12.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.720 "dma_device_type": 2 00:15:12.720 } 00:15:12.720 ], 00:15:12.720 "driver_specific": {} 00:15:12.720 }' 00:15:12.720 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.720 10:10:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.720 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:12.720 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.720 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.720 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:12.720 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.720 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.981 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:12.981 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.981 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.981 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:12.981 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:13.241 [2024-06-10 10:10:34.861441] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.241 10:10:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.241 10:10:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.241 "name": "Existed_Raid", 00:15:13.241 "uuid": "4f5a0461-32c6-4804-9985-51d9ec97976b", 00:15:13.241 "strip_size_kb": 0, 00:15:13.241 "state": "online", 00:15:13.241 "raid_level": "raid1", 00:15:13.241 "superblock": true, 00:15:13.241 "num_base_bdevs": 3, 00:15:13.241 "num_base_bdevs_discovered": 2, 00:15:13.241 "num_base_bdevs_operational": 2, 00:15:13.241 "base_bdevs_list": [ 00:15:13.241 { 00:15:13.241 "name": null, 00:15:13.241 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.241 "is_configured": false, 00:15:13.241 "data_offset": 2048, 00:15:13.241 "data_size": 63488 00:15:13.241 }, 00:15:13.241 { 00:15:13.241 "name": "BaseBdev2", 00:15:13.241 "uuid": "8f81c89b-dcf9-49c7-88f9-f25b957a0bee", 00:15:13.241 "is_configured": true, 00:15:13.241 "data_offset": 2048, 00:15:13.241 "data_size": 63488 00:15:13.241 }, 00:15:13.241 { 00:15:13.241 "name": "BaseBdev3", 00:15:13.241 "uuid": "190142d7-edb0-4df3-8f76-3728f2703e2b", 00:15:13.241 "is_configured": true, 00:15:13.241 "data_offset": 2048, 00:15:13.241 "data_size": 63488 00:15:13.241 } 00:15:13.241 ] 00:15:13.241 }' 00:15:13.241 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.241 10:10:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:13.811 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:13.811 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:13.811 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.811 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:14.074 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:14.074 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:14.074 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:14.383 [2024-06-10 10:10:35.972262] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:14.383 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:14.383 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:14.383 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.383 10:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:14.383 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:14.383 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:14.383 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:14.657 [2024-06-10 10:10:36.355083] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev3 00:15:14.657 [2024-06-10 10:10:36.355146] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:14.657 [2024-06-10 10:10:36.361101] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:14.657 [2024-06-10 10:10:36.361125] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:14.658 [2024-06-10 10:10:36.361131] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16502c0 name Existed_Raid, state offline 00:15:14.658 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:14.658 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:14.658 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.658 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:14.918 BaseBdev2 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:14.918 10:10:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:15.178 10:10:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:15.437 [ 00:15:15.437 { 00:15:15.437 "name": "BaseBdev2", 00:15:15.437 "aliases": [ 00:15:15.437 "45a44ec7-ae42-4701-a49c-e2118c5525e9" 00:15:15.437 ], 00:15:15.437 "product_name": "Malloc disk", 00:15:15.437 "block_size": 512, 00:15:15.437 "num_blocks": 65536, 00:15:15.437 "uuid": "45a44ec7-ae42-4701-a49c-e2118c5525e9", 00:15:15.437 "assigned_rate_limits": { 00:15:15.437 "rw_ios_per_sec": 0, 00:15:15.437 "rw_mbytes_per_sec": 0, 00:15:15.437 "r_mbytes_per_sec": 0, 00:15:15.437 "w_mbytes_per_sec": 0 00:15:15.437 }, 00:15:15.437 
"claimed": false, 00:15:15.437 "zoned": false, 00:15:15.437 "supported_io_types": { 00:15:15.437 "read": true, 00:15:15.437 "write": true, 00:15:15.437 "unmap": true, 00:15:15.437 "write_zeroes": true, 00:15:15.437 "flush": true, 00:15:15.437 "reset": true, 00:15:15.437 "compare": false, 00:15:15.437 "compare_and_write": false, 00:15:15.437 "abort": true, 00:15:15.437 "nvme_admin": false, 00:15:15.437 "nvme_io": false 00:15:15.437 }, 00:15:15.437 "memory_domains": [ 00:15:15.437 { 00:15:15.437 "dma_device_id": "system", 00:15:15.437 "dma_device_type": 1 00:15:15.437 }, 00:15:15.437 { 00:15:15.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.437 "dma_device_type": 2 00:15:15.437 } 00:15:15.437 ], 00:15:15.437 "driver_specific": {} 00:15:15.437 } 00:15:15.437 ] 00:15:15.437 10:10:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:15.437 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:15.437 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:15.437 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:15.437 BaseBdev3 00:15:15.697 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:15.697 10:10:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:15:15.697 10:10:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:15.697 10:10:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:15.697 10:10:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:15.697 10:10:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:15.697 10:10:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:15.697 10:10:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:15.957 [ 00:15:15.957 { 00:15:15.957 "name": "BaseBdev3", 00:15:15.957 "aliases": [ 00:15:15.957 "012a52c0-8e2a-48d3-b8bf-61ebda78f218" 00:15:15.957 ], 00:15:15.957 "product_name": "Malloc disk", 00:15:15.957 "block_size": 512, 00:15:15.957 "num_blocks": 65536, 00:15:15.957 "uuid": "012a52c0-8e2a-48d3-b8bf-61ebda78f218", 00:15:15.957 "assigned_rate_limits": { 00:15:15.957 "rw_ios_per_sec": 0, 00:15:15.957 "rw_mbytes_per_sec": 0, 00:15:15.957 "r_mbytes_per_sec": 0, 00:15:15.957 "w_mbytes_per_sec": 0 00:15:15.957 }, 00:15:15.957 "claimed": false, 00:15:15.957 "zoned": false, 00:15:15.957 "supported_io_types": { 00:15:15.957 "read": true, 00:15:15.957 "write": true, 00:15:15.957 "unmap": true, 00:15:15.957 "write_zeroes": true, 00:15:15.957 "flush": true, 00:15:15.957 "reset": true, 00:15:15.957 "compare": false, 00:15:15.957 "compare_and_write": false, 00:15:15.957 "abort": true, 00:15:15.957 "nvme_admin": false, 00:15:15.957 "nvme_io": false 00:15:15.957 }, 00:15:15.957 "memory_domains": [ 00:15:15.957 { 00:15:15.957 "dma_device_id": "system", 00:15:15.957 
"dma_device_type": 1 00:15:15.957 }, 00:15:15.957 { 00:15:15.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.957 "dma_device_type": 2 00:15:15.957 } 00:15:15.957 ], 00:15:15.957 "driver_specific": {} 00:15:15.957 } 00:15:15.957 ] 00:15:15.957 10:10:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:15.957 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:15.957 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:15.957 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:16.217 [2024-06-10 10:10:37.838734] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:16.217 [2024-06-10 10:10:37.838762] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:16.217 [2024-06-10 10:10:37.838773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:16.217 [2024-06-10 10:10:37.839780] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.217 10:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.217 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.217 "name": "Existed_Raid", 00:15:16.217 "uuid": "5e6e22fe-1820-4b27-a52a-d2ff18b08b45", 00:15:16.217 "strip_size_kb": 0, 00:15:16.217 "state": "configuring", 00:15:16.217 "raid_level": "raid1", 00:15:16.217 "superblock": true, 00:15:16.217 "num_base_bdevs": 3, 00:15:16.217 "num_base_bdevs_discovered": 2, 00:15:16.217 "num_base_bdevs_operational": 3, 00:15:16.218 "base_bdevs_list": [ 00:15:16.218 { 00:15:16.218 "name": "BaseBdev1", 00:15:16.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.218 "is_configured": false, 00:15:16.218 "data_offset": 0, 00:15:16.218 
"data_size": 0 00:15:16.218 }, 00:15:16.218 { 00:15:16.218 "name": "BaseBdev2", 00:15:16.218 "uuid": "45a44ec7-ae42-4701-a49c-e2118c5525e9", 00:15:16.218 "is_configured": true, 00:15:16.218 "data_offset": 2048, 00:15:16.218 "data_size": 63488 00:15:16.218 }, 00:15:16.218 { 00:15:16.218 "name": "BaseBdev3", 00:15:16.218 "uuid": "012a52c0-8e2a-48d3-b8bf-61ebda78f218", 00:15:16.218 "is_configured": true, 00:15:16.218 "data_offset": 2048, 00:15:16.218 "data_size": 63488 00:15:16.218 } 00:15:16.218 ] 00:15:16.218 }' 00:15:16.218 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:16.218 10:10:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:16.789 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:17.049 [2024-06-10 10:10:38.728958] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.049 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.310 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.310 "name": "Existed_Raid", 00:15:17.310 "uuid": "5e6e22fe-1820-4b27-a52a-d2ff18b08b45", 00:15:17.310 "strip_size_kb": 0, 00:15:17.310 "state": "configuring", 00:15:17.310 "raid_level": "raid1", 00:15:17.310 "superblock": true, 00:15:17.310 "num_base_bdevs": 3, 00:15:17.310 "num_base_bdevs_discovered": 1, 00:15:17.310 "num_base_bdevs_operational": 3, 00:15:17.310 "base_bdevs_list": [ 00:15:17.310 { 00:15:17.310 "name": "BaseBdev1", 00:15:17.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.310 "is_configured": false, 00:15:17.310 "data_offset": 0, 00:15:17.310 "data_size": 0 00:15:17.310 }, 00:15:17.310 { 00:15:17.310 "name": null, 00:15:17.310 "uuid": "45a44ec7-ae42-4701-a49c-e2118c5525e9", 00:15:17.310 "is_configured": false, 00:15:17.310 "data_offset": 2048, 00:15:17.310 "data_size": 63488 00:15:17.310 }, 00:15:17.310 { 00:15:17.310 
"name": "BaseBdev3", 00:15:17.310 "uuid": "012a52c0-8e2a-48d3-b8bf-61ebda78f218", 00:15:17.310 "is_configured": true, 00:15:17.310 "data_offset": 2048, 00:15:17.310 "data_size": 63488 00:15:17.310 } 00:15:17.310 ] 00:15:17.310 }' 00:15:17.310 10:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.310 10:10:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:17.880 10:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.880 10:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:17.880 10:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:17.880 10:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:18.141 [2024-06-10 10:10:39.856748] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:18.141 BaseBdev1 00:15:18.141 10:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:18.141 10:10:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:15:18.141 10:10:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:18.141 10:10:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:18.141 10:10:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:18.141 10:10:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:18.141 10:10:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:18.401 [ 00:15:18.401 { 00:15:18.401 "name": "BaseBdev1", 00:15:18.401 "aliases": [ 00:15:18.401 "51438a23-8f2a-4268-a363-19d333a95f62" 00:15:18.401 ], 00:15:18.401 "product_name": "Malloc disk", 00:15:18.401 "block_size": 512, 00:15:18.401 "num_blocks": 65536, 00:15:18.401 "uuid": "51438a23-8f2a-4268-a363-19d333a95f62", 00:15:18.401 "assigned_rate_limits": { 00:15:18.401 "rw_ios_per_sec": 0, 00:15:18.401 "rw_mbytes_per_sec": 0, 00:15:18.401 "r_mbytes_per_sec": 0, 00:15:18.401 "w_mbytes_per_sec": 0 00:15:18.401 }, 00:15:18.401 "claimed": true, 00:15:18.401 "claim_type": "exclusive_write", 00:15:18.401 "zoned": false, 00:15:18.401 "supported_io_types": { 00:15:18.401 "read": true, 00:15:18.401 "write": true, 00:15:18.401 "unmap": true, 00:15:18.401 "write_zeroes": true, 00:15:18.401 "flush": true, 00:15:18.401 "reset": true, 00:15:18.401 "compare": false, 00:15:18.401 "compare_and_write": false, 00:15:18.401 "abort": true, 00:15:18.401 "nvme_admin": false, 00:15:18.401 "nvme_io": false 00:15:18.401 }, 00:15:18.401 "memory_domains": [ 00:15:18.401 { 00:15:18.401 "dma_device_id": "system", 00:15:18.401 "dma_device_type": 1 00:15:18.401 }, 00:15:18.401 { 
00:15:18.401 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.401 "dma_device_type": 2 00:15:18.401 } 00:15:18.401 ], 00:15:18.401 "driver_specific": {} 00:15:18.401 } 00:15:18.401 ] 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.401 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:18.661 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:18.661 "name": "Existed_Raid", 00:15:18.661 "uuid": "5e6e22fe-1820-4b27-a52a-d2ff18b08b45", 00:15:18.661 "strip_size_kb": 0, 00:15:18.661 "state": "configuring", 00:15:18.661 "raid_level": "raid1", 00:15:18.661 "superblock": true, 00:15:18.661 "num_base_bdevs": 3, 00:15:18.661 "num_base_bdevs_discovered": 2, 00:15:18.661 "num_base_bdevs_operational": 3, 00:15:18.661 "base_bdevs_list": [ 00:15:18.661 { 00:15:18.661 "name": "BaseBdev1", 00:15:18.661 "uuid": "51438a23-8f2a-4268-a363-19d333a95f62", 00:15:18.661 "is_configured": true, 00:15:18.661 "data_offset": 2048, 00:15:18.661 "data_size": 63488 00:15:18.661 }, 00:15:18.661 { 00:15:18.661 "name": null, 00:15:18.661 "uuid": "45a44ec7-ae42-4701-a49c-e2118c5525e9", 00:15:18.661 "is_configured": false, 00:15:18.661 "data_offset": 2048, 00:15:18.661 "data_size": 63488 00:15:18.661 }, 00:15:18.661 { 00:15:18.661 "name": "BaseBdev3", 00:15:18.661 "uuid": "012a52c0-8e2a-48d3-b8bf-61ebda78f218", 00:15:18.661 "is_configured": true, 00:15:18.661 "data_offset": 2048, 00:15:18.661 "data_size": 63488 00:15:18.661 } 00:15:18.661 ] 00:15:18.661 }' 00:15:18.662 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:18.662 10:10:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:19.231 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.231 10:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:15:19.491 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:19.491 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:19.491 [2024-06-10 10:10:41.348541] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.752 "name": "Existed_Raid", 00:15:19.752 "uuid": "5e6e22fe-1820-4b27-a52a-d2ff18b08b45", 00:15:19.752 "strip_size_kb": 0, 00:15:19.752 "state": "configuring", 00:15:19.752 "raid_level": "raid1", 00:15:19.752 "superblock": true, 00:15:19.752 "num_base_bdevs": 3, 00:15:19.752 "num_base_bdevs_discovered": 1, 00:15:19.752 "num_base_bdevs_operational": 3, 00:15:19.752 "base_bdevs_list": [ 00:15:19.752 { 00:15:19.752 "name": "BaseBdev1", 00:15:19.752 "uuid": "51438a23-8f2a-4268-a363-19d333a95f62", 00:15:19.752 "is_configured": true, 00:15:19.752 "data_offset": 2048, 00:15:19.752 "data_size": 63488 00:15:19.752 }, 00:15:19.752 { 00:15:19.752 "name": null, 00:15:19.752 "uuid": "45a44ec7-ae42-4701-a49c-e2118c5525e9", 00:15:19.752 "is_configured": false, 00:15:19.752 "data_offset": 2048, 00:15:19.752 "data_size": 63488 00:15:19.752 }, 00:15:19.752 { 00:15:19.752 "name": null, 00:15:19.752 "uuid": "012a52c0-8e2a-48d3-b8bf-61ebda78f218", 00:15:19.752 "is_configured": false, 00:15:19.752 "data_offset": 2048, 00:15:19.752 "data_size": 63488 00:15:19.752 } 00:15:19.752 ] 00:15:19.752 }' 00:15:19.752 10:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.753 10:10:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:20.392 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.392 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:20.392 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:20.392 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:20.652 [2024-06-10 10:10:42.431294] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.652 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:20.912 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.912 "name": "Existed_Raid", 00:15:20.912 "uuid": "5e6e22fe-1820-4b27-a52a-d2ff18b08b45", 00:15:20.912 "strip_size_kb": 0, 00:15:20.912 "state": "configuring", 00:15:20.912 "raid_level": "raid1", 00:15:20.912 "superblock": true, 00:15:20.912 "num_base_bdevs": 3, 00:15:20.912 "num_base_bdevs_discovered": 2, 00:15:20.912 "num_base_bdevs_operational": 3, 00:15:20.912 "base_bdevs_list": [ 00:15:20.912 { 00:15:20.912 "name": "BaseBdev1", 00:15:20.912 "uuid": "51438a23-8f2a-4268-a363-19d333a95f62", 00:15:20.912 "is_configured": true, 00:15:20.912 "data_offset": 2048, 00:15:20.912 "data_size": 63488 00:15:20.912 }, 00:15:20.912 { 00:15:20.912 "name": null, 00:15:20.912 "uuid": "45a44ec7-ae42-4701-a49c-e2118c5525e9", 00:15:20.912 "is_configured": false, 00:15:20.912 "data_offset": 2048, 00:15:20.912 "data_size": 63488 00:15:20.912 }, 00:15:20.912 { 00:15:20.912 "name": "BaseBdev3", 00:15:20.912 "uuid": "012a52c0-8e2a-48d3-b8bf-61ebda78f218", 00:15:20.912 "is_configured": true, 00:15:20.912 "data_offset": 2048, 00:15:20.912 "data_size": 63488 00:15:20.912 } 00:15:20.912 ] 00:15:20.912 }' 00:15:20.912 10:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:15:20.912 10:10:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:21.482 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.482 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:21.482 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:21.482 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:21.742 [2024-06-10 10:10:43.518069] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.742 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:22.002 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:22.002 "name": "Existed_Raid", 00:15:22.002 "uuid": "5e6e22fe-1820-4b27-a52a-d2ff18b08b45", 00:15:22.002 "strip_size_kb": 0, 00:15:22.002 "state": "configuring", 00:15:22.002 "raid_level": "raid1", 00:15:22.002 "superblock": true, 00:15:22.002 "num_base_bdevs": 3, 00:15:22.002 "num_base_bdevs_discovered": 1, 00:15:22.002 "num_base_bdevs_operational": 3, 00:15:22.002 "base_bdevs_list": [ 00:15:22.002 { 00:15:22.002 "name": null, 00:15:22.002 "uuid": "51438a23-8f2a-4268-a363-19d333a95f62", 00:15:22.002 "is_configured": false, 00:15:22.002 "data_offset": 2048, 00:15:22.002 "data_size": 63488 00:15:22.002 }, 00:15:22.002 { 00:15:22.002 "name": null, 00:15:22.002 "uuid": "45a44ec7-ae42-4701-a49c-e2118c5525e9", 00:15:22.002 "is_configured": false, 00:15:22.002 "data_offset": 2048, 00:15:22.002 "data_size": 63488 00:15:22.002 }, 00:15:22.002 { 00:15:22.002 "name": "BaseBdev3", 00:15:22.002 "uuid": "012a52c0-8e2a-48d3-b8bf-61ebda78f218", 00:15:22.002 "is_configured": true, 00:15:22.002 "data_offset": 2048, 00:15:22.002 
"data_size": 63488 00:15:22.002 } 00:15:22.002 ] 00:15:22.002 }' 00:15:22.002 10:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:22.002 10:10:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.572 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.572 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:22.572 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:22.572 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:22.832 [2024-06-10 10:10:44.614573] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.832 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.092 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.092 "name": "Existed_Raid", 00:15:23.092 "uuid": "5e6e22fe-1820-4b27-a52a-d2ff18b08b45", 00:15:23.092 "strip_size_kb": 0, 00:15:23.092 "state": "configuring", 00:15:23.092 "raid_level": "raid1", 00:15:23.092 "superblock": true, 00:15:23.092 "num_base_bdevs": 3, 00:15:23.092 "num_base_bdevs_discovered": 2, 00:15:23.092 "num_base_bdevs_operational": 3, 00:15:23.092 "base_bdevs_list": [ 00:15:23.092 { 00:15:23.092 "name": null, 00:15:23.092 "uuid": "51438a23-8f2a-4268-a363-19d333a95f62", 00:15:23.092 "is_configured": false, 00:15:23.092 "data_offset": 2048, 00:15:23.092 "data_size": 63488 00:15:23.092 }, 00:15:23.092 { 00:15:23.092 "name": "BaseBdev2", 00:15:23.092 "uuid": "45a44ec7-ae42-4701-a49c-e2118c5525e9", 00:15:23.092 "is_configured": true, 00:15:23.092 "data_offset": 2048, 00:15:23.092 "data_size": 63488 
00:15:23.092 }, 00:15:23.092 { 00:15:23.092 "name": "BaseBdev3", 00:15:23.092 "uuid": "012a52c0-8e2a-48d3-b8bf-61ebda78f218", 00:15:23.092 "is_configured": true, 00:15:23.092 "data_offset": 2048, 00:15:23.092 "data_size": 63488 00:15:23.092 } 00:15:23.092 ] 00:15:23.092 }' 00:15:23.092 10:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.092 10:10:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:23.661 10:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.661 10:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:23.920 10:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:23.920 10:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.920 10:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:23.920 10:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 51438a23-8f2a-4268-a363-19d333a95f62 00:15:24.180 [2024-06-10 10:10:45.942605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:24.180 [2024-06-10 10:10:45.942712] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17f3e80 00:15:24.180 [2024-06-10 10:10:45.942719] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:24.180 [2024-06-10 10:10:45.942864] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f4ed0 00:15:24.180 [2024-06-10 10:10:45.942961] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17f3e80 00:15:24.180 [2024-06-10 10:10:45.942966] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17f3e80 00:15:24.180 [2024-06-10 10:10:45.943034] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:24.180 NewBaseBdev 00:15:24.180 10:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:24.180 10:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:15:24.180 10:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:24.180 10:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:24.180 10:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:24.180 10:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:24.180 10:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:24.440 10:10:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 
-t 2000 00:15:24.699 [ 00:15:24.699 { 00:15:24.699 "name": "NewBaseBdev", 00:15:24.699 "aliases": [ 00:15:24.699 "51438a23-8f2a-4268-a363-19d333a95f62" 00:15:24.699 ], 00:15:24.699 "product_name": "Malloc disk", 00:15:24.699 "block_size": 512, 00:15:24.700 "num_blocks": 65536, 00:15:24.700 "uuid": "51438a23-8f2a-4268-a363-19d333a95f62", 00:15:24.700 "assigned_rate_limits": { 00:15:24.700 "rw_ios_per_sec": 0, 00:15:24.700 "rw_mbytes_per_sec": 0, 00:15:24.700 "r_mbytes_per_sec": 0, 00:15:24.700 "w_mbytes_per_sec": 0 00:15:24.700 }, 00:15:24.700 "claimed": true, 00:15:24.700 "claim_type": "exclusive_write", 00:15:24.700 "zoned": false, 00:15:24.700 "supported_io_types": { 00:15:24.700 "read": true, 00:15:24.700 "write": true, 00:15:24.700 "unmap": true, 00:15:24.700 "write_zeroes": true, 00:15:24.700 "flush": true, 00:15:24.700 "reset": true, 00:15:24.700 "compare": false, 00:15:24.700 "compare_and_write": false, 00:15:24.700 "abort": true, 00:15:24.700 "nvme_admin": false, 00:15:24.700 "nvme_io": false 00:15:24.700 }, 00:15:24.700 "memory_domains": [ 00:15:24.700 { 00:15:24.700 "dma_device_id": "system", 00:15:24.700 "dma_device_type": 1 00:15:24.700 }, 00:15:24.700 { 00:15:24.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.700 "dma_device_type": 2 00:15:24.700 } 00:15:24.700 ], 00:15:24.700 "driver_specific": {} 00:15:24.700 } 00:15:24.700 ] 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.700 "name": "Existed_Raid", 00:15:24.700 "uuid": "5e6e22fe-1820-4b27-a52a-d2ff18b08b45", 00:15:24.700 "strip_size_kb": 0, 00:15:24.700 "state": "online", 00:15:24.700 "raid_level": "raid1", 00:15:24.700 "superblock": true, 00:15:24.700 "num_base_bdevs": 3, 00:15:24.700 "num_base_bdevs_discovered": 3, 00:15:24.700 "num_base_bdevs_operational": 3, 00:15:24.700 "base_bdevs_list": [ 00:15:24.700 { 00:15:24.700 "name": "NewBaseBdev", 
00:15:24.700 "uuid": "51438a23-8f2a-4268-a363-19d333a95f62", 00:15:24.700 "is_configured": true, 00:15:24.700 "data_offset": 2048, 00:15:24.700 "data_size": 63488 00:15:24.700 }, 00:15:24.700 { 00:15:24.700 "name": "BaseBdev2", 00:15:24.700 "uuid": "45a44ec7-ae42-4701-a49c-e2118c5525e9", 00:15:24.700 "is_configured": true, 00:15:24.700 "data_offset": 2048, 00:15:24.700 "data_size": 63488 00:15:24.700 }, 00:15:24.700 { 00:15:24.700 "name": "BaseBdev3", 00:15:24.700 "uuid": "012a52c0-8e2a-48d3-b8bf-61ebda78f218", 00:15:24.700 "is_configured": true, 00:15:24.700 "data_offset": 2048, 00:15:24.700 "data_size": 63488 00:15:24.700 } 00:15:24.700 ] 00:15:24.700 }' 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.700 10:10:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:25.270 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:25.270 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:25.270 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:25.270 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:25.270 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:25.270 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:25.270 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:25.270 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:25.530 [2024-06-10 10:10:47.258149] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:25.530 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:25.530 "name": "Existed_Raid", 00:15:25.530 "aliases": [ 00:15:25.530 "5e6e22fe-1820-4b27-a52a-d2ff18b08b45" 00:15:25.530 ], 00:15:25.530 "product_name": "Raid Volume", 00:15:25.530 "block_size": 512, 00:15:25.530 "num_blocks": 63488, 00:15:25.530 "uuid": "5e6e22fe-1820-4b27-a52a-d2ff18b08b45", 00:15:25.530 "assigned_rate_limits": { 00:15:25.530 "rw_ios_per_sec": 0, 00:15:25.530 "rw_mbytes_per_sec": 0, 00:15:25.530 "r_mbytes_per_sec": 0, 00:15:25.530 "w_mbytes_per_sec": 0 00:15:25.530 }, 00:15:25.530 "claimed": false, 00:15:25.530 "zoned": false, 00:15:25.530 "supported_io_types": { 00:15:25.530 "read": true, 00:15:25.530 "write": true, 00:15:25.530 "unmap": false, 00:15:25.530 "write_zeroes": true, 00:15:25.530 "flush": false, 00:15:25.530 "reset": true, 00:15:25.530 "compare": false, 00:15:25.530 "compare_and_write": false, 00:15:25.530 "abort": false, 00:15:25.530 "nvme_admin": false, 00:15:25.530 "nvme_io": false 00:15:25.530 }, 00:15:25.530 "memory_domains": [ 00:15:25.530 { 00:15:25.530 "dma_device_id": "system", 00:15:25.530 "dma_device_type": 1 00:15:25.530 }, 00:15:25.530 { 00:15:25.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.530 "dma_device_type": 2 00:15:25.530 }, 00:15:25.530 { 00:15:25.530 "dma_device_id": "system", 00:15:25.530 "dma_device_type": 1 00:15:25.530 }, 00:15:25.530 { 00:15:25.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.530 "dma_device_type": 2 00:15:25.530 }, 00:15:25.530 { 
00:15:25.530 "dma_device_id": "system", 00:15:25.530 "dma_device_type": 1 00:15:25.530 }, 00:15:25.530 { 00:15:25.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.530 "dma_device_type": 2 00:15:25.530 } 00:15:25.530 ], 00:15:25.530 "driver_specific": { 00:15:25.530 "raid": { 00:15:25.530 "uuid": "5e6e22fe-1820-4b27-a52a-d2ff18b08b45", 00:15:25.530 "strip_size_kb": 0, 00:15:25.530 "state": "online", 00:15:25.530 "raid_level": "raid1", 00:15:25.530 "superblock": true, 00:15:25.530 "num_base_bdevs": 3, 00:15:25.530 "num_base_bdevs_discovered": 3, 00:15:25.530 "num_base_bdevs_operational": 3, 00:15:25.530 "base_bdevs_list": [ 00:15:25.530 { 00:15:25.530 "name": "NewBaseBdev", 00:15:25.530 "uuid": "51438a23-8f2a-4268-a363-19d333a95f62", 00:15:25.530 "is_configured": true, 00:15:25.530 "data_offset": 2048, 00:15:25.530 "data_size": 63488 00:15:25.530 }, 00:15:25.530 { 00:15:25.530 "name": "BaseBdev2", 00:15:25.530 "uuid": "45a44ec7-ae42-4701-a49c-e2118c5525e9", 00:15:25.530 "is_configured": true, 00:15:25.530 "data_offset": 2048, 00:15:25.530 "data_size": 63488 00:15:25.530 }, 00:15:25.530 { 00:15:25.530 "name": "BaseBdev3", 00:15:25.530 "uuid": "012a52c0-8e2a-48d3-b8bf-61ebda78f218", 00:15:25.530 "is_configured": true, 00:15:25.530 "data_offset": 2048, 00:15:25.530 "data_size": 63488 00:15:25.530 } 00:15:25.530 ] 00:15:25.530 } 00:15:25.530 } 00:15:25.530 }' 00:15:25.530 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:25.530 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:25.530 BaseBdev2 00:15:25.530 BaseBdev3' 00:15:25.530 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:25.530 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:25.530 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:25.791 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:25.791 "name": "NewBaseBdev", 00:15:25.791 "aliases": [ 00:15:25.791 "51438a23-8f2a-4268-a363-19d333a95f62" 00:15:25.791 ], 00:15:25.791 "product_name": "Malloc disk", 00:15:25.791 "block_size": 512, 00:15:25.791 "num_blocks": 65536, 00:15:25.791 "uuid": "51438a23-8f2a-4268-a363-19d333a95f62", 00:15:25.791 "assigned_rate_limits": { 00:15:25.791 "rw_ios_per_sec": 0, 00:15:25.791 "rw_mbytes_per_sec": 0, 00:15:25.791 "r_mbytes_per_sec": 0, 00:15:25.791 "w_mbytes_per_sec": 0 00:15:25.791 }, 00:15:25.791 "claimed": true, 00:15:25.791 "claim_type": "exclusive_write", 00:15:25.791 "zoned": false, 00:15:25.791 "supported_io_types": { 00:15:25.791 "read": true, 00:15:25.791 "write": true, 00:15:25.791 "unmap": true, 00:15:25.791 "write_zeroes": true, 00:15:25.791 "flush": true, 00:15:25.791 "reset": true, 00:15:25.791 "compare": false, 00:15:25.791 "compare_and_write": false, 00:15:25.791 "abort": true, 00:15:25.791 "nvme_admin": false, 00:15:25.791 "nvme_io": false 00:15:25.791 }, 00:15:25.791 "memory_domains": [ 00:15:25.791 { 00:15:25.791 "dma_device_id": "system", 00:15:25.791 "dma_device_type": 1 00:15:25.791 }, 00:15:25.791 { 00:15:25.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.791 "dma_device_type": 2 00:15:25.791 } 00:15:25.791 ], 00:15:25.791 
"driver_specific": {} 00:15:25.791 }' 00:15:25.791 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.791 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.791 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:25.791 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.791 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.051 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:26.051 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.051 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.051 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.052 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.052 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.052 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.052 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:26.052 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:26.052 10:10:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.312 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.312 "name": "BaseBdev2", 00:15:26.312 "aliases": [ 00:15:26.312 "45a44ec7-ae42-4701-a49c-e2118c5525e9" 00:15:26.312 ], 00:15:26.312 "product_name": "Malloc disk", 00:15:26.312 "block_size": 512, 00:15:26.312 "num_blocks": 65536, 00:15:26.312 "uuid": "45a44ec7-ae42-4701-a49c-e2118c5525e9", 00:15:26.312 "assigned_rate_limits": { 00:15:26.312 "rw_ios_per_sec": 0, 00:15:26.312 "rw_mbytes_per_sec": 0, 00:15:26.312 "r_mbytes_per_sec": 0, 00:15:26.312 "w_mbytes_per_sec": 0 00:15:26.312 }, 00:15:26.312 "claimed": true, 00:15:26.312 "claim_type": "exclusive_write", 00:15:26.312 "zoned": false, 00:15:26.312 "supported_io_types": { 00:15:26.312 "read": true, 00:15:26.312 "write": true, 00:15:26.312 "unmap": true, 00:15:26.312 "write_zeroes": true, 00:15:26.312 "flush": true, 00:15:26.312 "reset": true, 00:15:26.312 "compare": false, 00:15:26.312 "compare_and_write": false, 00:15:26.312 "abort": true, 00:15:26.312 "nvme_admin": false, 00:15:26.312 "nvme_io": false 00:15:26.312 }, 00:15:26.312 "memory_domains": [ 00:15:26.312 { 00:15:26.312 "dma_device_id": "system", 00:15:26.312 "dma_device_type": 1 00:15:26.312 }, 00:15:26.312 { 00:15:26.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.312 "dma_device_type": 2 00:15:26.312 } 00:15:26.312 ], 00:15:26.312 "driver_specific": {} 00:15:26.312 }' 00:15:26.312 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.312 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.312 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.312 10:10:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.572 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.572 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:26.572 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.572 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.572 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.572 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.572 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.572 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.572 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:26.572 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.572 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:26.832 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.832 "name": "BaseBdev3", 00:15:26.832 "aliases": [ 00:15:26.832 "012a52c0-8e2a-48d3-b8bf-61ebda78f218" 00:15:26.832 ], 00:15:26.832 "product_name": "Malloc disk", 00:15:26.832 "block_size": 512, 00:15:26.832 "num_blocks": 65536, 00:15:26.832 "uuid": "012a52c0-8e2a-48d3-b8bf-61ebda78f218", 00:15:26.832 "assigned_rate_limits": { 00:15:26.832 "rw_ios_per_sec": 0, 00:15:26.832 "rw_mbytes_per_sec": 0, 00:15:26.832 "r_mbytes_per_sec": 0, 00:15:26.832 "w_mbytes_per_sec": 0 00:15:26.832 }, 00:15:26.832 "claimed": true, 00:15:26.832 "claim_type": "exclusive_write", 00:15:26.832 "zoned": false, 00:15:26.832 "supported_io_types": { 00:15:26.832 "read": true, 00:15:26.832 "write": true, 00:15:26.832 "unmap": true, 00:15:26.832 "write_zeroes": true, 00:15:26.832 "flush": true, 00:15:26.832 "reset": true, 00:15:26.832 "compare": false, 00:15:26.832 "compare_and_write": false, 00:15:26.832 "abort": true, 00:15:26.832 "nvme_admin": false, 00:15:26.832 "nvme_io": false 00:15:26.832 }, 00:15:26.832 "memory_domains": [ 00:15:26.832 { 00:15:26.832 "dma_device_id": "system", 00:15:26.832 "dma_device_type": 1 00:15:26.832 }, 00:15:26.832 { 00:15:26.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.832 "dma_device_type": 2 00:15:26.832 } 00:15:26.832 ], 00:15:26.832 "driver_specific": {} 00:15:26.832 }' 00:15:26.832 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.832 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.092 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:27.092 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.092 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.092 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:27.092 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.092 10:10:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.092 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:27.092 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.092 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.353 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:27.353 10:10:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:27.353 [2024-06-10 10:10:49.130803] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:27.353 [2024-06-10 10:10:49.130820] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:27.353 [2024-06-10 10:10:49.130858] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:27.353 [2024-06-10 10:10:49.131060] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:27.353 [2024-06-10 10:10:49.131067] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f3e80 name Existed_Raid, state offline 00:15:27.353 10:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1011027 00:15:27.353 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1011027 ']' 00:15:27.353 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1011027 00:15:27.353 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:15:27.353 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:15:27.353 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1011027 00:15:27.353 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:15:27.353 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:15:27.353 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1011027' 00:15:27.353 killing process with pid 1011027 00:15:27.353 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1011027 00:15:27.353 [2024-06-10 10:10:49.189885] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:27.353 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1011027 00:15:27.353 [2024-06-10 10:10:49.204488] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:27.614 10:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:27.614 00:15:27.614 real 0m23.614s 00:15:27.614 user 0m44.340s 00:15:27.614 sys 0m3.468s 00:15:27.614 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:27.614 10:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:27.614 ************************************ 00:15:27.614 END TEST raid_state_function_test_sb 00:15:27.614 ************************************ 00:15:27.614 10:10:49 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test raid1 3 00:15:27.614 10:10:49 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:15:27.614 10:10:49 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:27.614 10:10:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:27.614 ************************************ 00:15:27.614 START TEST raid_superblock_test 00:15:27.614 ************************************ 00:15:27.614 10:10:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 3 00:15:27.614 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1015609 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1015609 /var/tmp/spdk-raid.sock 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1015609 ']' 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:27.615 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:15:27.615 10:10:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:27.615 [2024-06-10 10:10:49.454238] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
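For orientation, a condensed bash sketch of the setup sequence that raid_superblock_test drives over the raid RPC socket, using only the commands, paths, block sizes and UUIDs that appear verbatim in this trace; the RPC shell variable and the for-loop are illustrative shorthand, not an excerpt from bdev_raid.sh:

  #!/usr/bin/env bash
  set -e
  # bdev_svc is already listening on the raid test socket, as launched above.
  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Create three 32 MiB malloc bdevs (512-byte blocks -> 65536 blocks each) and
  # wrap each in a passthru bdev with a fixed UUID: malloc1/pt1 ... malloc3/pt3.
  for i in 1 2 3; do
      $RPC bdev_malloc_create 32 512 -b "malloc$i"
      $RPC bdev_passthru_create -b "malloc$i" -p "pt$i" \
          -u "00000000-0000-0000-0000-00000000000$i"
  done

  # Assemble the passthru bdevs into a raid1 volume with an on-disk superblock (-s).
  $RPC bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s

  # Inspect the result the same way the verification helpers below do.
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
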
00:15:27.615 [2024-06-10 10:10:49.454285] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1015609 ] 00:15:27.876 [2024-06-10 10:10:49.544260] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:27.876 [2024-06-10 10:10:49.609680] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:15:27.876 [2024-06-10 10:10:49.648730] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:27.876 [2024-06-10 10:10:49.648753] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:28.446 10:10:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:15:28.446 10:10:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:15:28.446 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:28.446 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:28.446 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:28.446 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:28.446 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:28.446 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:28.446 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:28.446 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:28.446 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:28.707 malloc1 00:15:28.707 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:28.967 [2024-06-10 10:10:50.667329] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:28.968 [2024-06-10 10:10:50.667364] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:28.968 [2024-06-10 10:10:50.667375] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b76990 00:15:28.968 [2024-06-10 10:10:50.667382] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:28.968 [2024-06-10 10:10:50.668665] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:28.968 [2024-06-10 10:10:50.668685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:28.968 pt1 00:15:28.968 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:28.968 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:28.968 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:28.968 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:28.968 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:28.968 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:28.968 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:28.968 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:28.968 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:29.228 malloc2 00:15:29.228 10:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:29.228 [2024-06-10 10:10:51.038118] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:29.228 [2024-06-10 10:10:51.038151] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:29.228 [2024-06-10 10:10:51.038163] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b774e0 00:15:29.228 [2024-06-10 10:10:51.038169] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:29.228 [2024-06-10 10:10:51.039355] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:29.228 [2024-06-10 10:10:51.039373] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:29.228 pt2 00:15:29.228 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:29.228 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:29.228 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:29.228 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:29.228 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:29.228 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:29.228 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:29.228 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:29.228 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:29.488 malloc3 00:15:29.489 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:29.749 [2024-06-10 10:10:51.408614] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:29.749 [2024-06-10 10:10:51.408644] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:29.749 [2024-06-10 10:10:51.408654] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d234e0 00:15:29.749 [2024-06-10 10:10:51.408660] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:29.749 [2024-06-10 10:10:51.409818] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:29.749 [2024-06-10 10:10:51.409841] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:29.749 pt3 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:29.749 [2024-06-10 10:10:51.593100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:29.749 [2024-06-10 10:10:51.594099] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:29.749 [2024-06-10 10:10:51.594141] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:29.749 [2024-06-10 10:10:51.594257] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d24830 00:15:29.749 [2024-06-10 10:10:51.594264] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:29.749 [2024-06-10 10:10:51.594414] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1842dc0 00:15:29.749 [2024-06-10 10:10:51.594524] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d24830 00:15:29.749 [2024-06-10 10:10:51.594530] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d24830 00:15:29.749 [2024-06-10 10:10:51.594599] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.749 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.010 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:30.010 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.010 "name": "raid_bdev1", 00:15:30.010 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:30.010 "strip_size_kb": 0, 00:15:30.010 "state": "online", 00:15:30.010 "raid_level": "raid1", 00:15:30.010 "superblock": true, 00:15:30.010 "num_base_bdevs": 3, 00:15:30.010 
"num_base_bdevs_discovered": 3, 00:15:30.010 "num_base_bdevs_operational": 3, 00:15:30.010 "base_bdevs_list": [ 00:15:30.010 { 00:15:30.010 "name": "pt1", 00:15:30.010 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:30.010 "is_configured": true, 00:15:30.010 "data_offset": 2048, 00:15:30.010 "data_size": 63488 00:15:30.010 }, 00:15:30.010 { 00:15:30.010 "name": "pt2", 00:15:30.010 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:30.010 "is_configured": true, 00:15:30.010 "data_offset": 2048, 00:15:30.010 "data_size": 63488 00:15:30.010 }, 00:15:30.010 { 00:15:30.010 "name": "pt3", 00:15:30.010 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:30.010 "is_configured": true, 00:15:30.010 "data_offset": 2048, 00:15:30.010 "data_size": 63488 00:15:30.010 } 00:15:30.010 ] 00:15:30.010 }' 00:15:30.010 10:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.010 10:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.579 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:30.579 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:30.579 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:30.579 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:30.579 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:30.579 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:30.579 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:30.579 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:30.840 [2024-06-10 10:10:52.515602] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:30.840 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:30.840 "name": "raid_bdev1", 00:15:30.840 "aliases": [ 00:15:30.840 "e6533b47-4682-42ef-b391-4f92eddd7c53" 00:15:30.840 ], 00:15:30.840 "product_name": "Raid Volume", 00:15:30.840 "block_size": 512, 00:15:30.840 "num_blocks": 63488, 00:15:30.840 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:30.840 "assigned_rate_limits": { 00:15:30.840 "rw_ios_per_sec": 0, 00:15:30.840 "rw_mbytes_per_sec": 0, 00:15:30.840 "r_mbytes_per_sec": 0, 00:15:30.840 "w_mbytes_per_sec": 0 00:15:30.840 }, 00:15:30.840 "claimed": false, 00:15:30.840 "zoned": false, 00:15:30.840 "supported_io_types": { 00:15:30.840 "read": true, 00:15:30.840 "write": true, 00:15:30.840 "unmap": false, 00:15:30.840 "write_zeroes": true, 00:15:30.840 "flush": false, 00:15:30.840 "reset": true, 00:15:30.840 "compare": false, 00:15:30.840 "compare_and_write": false, 00:15:30.840 "abort": false, 00:15:30.840 "nvme_admin": false, 00:15:30.840 "nvme_io": false 00:15:30.840 }, 00:15:30.840 "memory_domains": [ 00:15:30.840 { 00:15:30.840 "dma_device_id": "system", 00:15:30.840 "dma_device_type": 1 00:15:30.840 }, 00:15:30.840 { 00:15:30.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.840 "dma_device_type": 2 00:15:30.840 }, 00:15:30.840 { 00:15:30.840 "dma_device_id": "system", 00:15:30.840 "dma_device_type": 1 00:15:30.840 }, 00:15:30.840 { 00:15:30.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:15:30.840 "dma_device_type": 2 00:15:30.840 }, 00:15:30.840 { 00:15:30.840 "dma_device_id": "system", 00:15:30.840 "dma_device_type": 1 00:15:30.840 }, 00:15:30.840 { 00:15:30.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.840 "dma_device_type": 2 00:15:30.840 } 00:15:30.840 ], 00:15:30.840 "driver_specific": { 00:15:30.840 "raid": { 00:15:30.840 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:30.840 "strip_size_kb": 0, 00:15:30.840 "state": "online", 00:15:30.840 "raid_level": "raid1", 00:15:30.840 "superblock": true, 00:15:30.840 "num_base_bdevs": 3, 00:15:30.840 "num_base_bdevs_discovered": 3, 00:15:30.840 "num_base_bdevs_operational": 3, 00:15:30.840 "base_bdevs_list": [ 00:15:30.840 { 00:15:30.840 "name": "pt1", 00:15:30.840 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:30.840 "is_configured": true, 00:15:30.840 "data_offset": 2048, 00:15:30.840 "data_size": 63488 00:15:30.840 }, 00:15:30.840 { 00:15:30.840 "name": "pt2", 00:15:30.840 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:30.840 "is_configured": true, 00:15:30.840 "data_offset": 2048, 00:15:30.840 "data_size": 63488 00:15:30.840 }, 00:15:30.840 { 00:15:30.840 "name": "pt3", 00:15:30.840 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:30.840 "is_configured": true, 00:15:30.840 "data_offset": 2048, 00:15:30.840 "data_size": 63488 00:15:30.840 } 00:15:30.840 ] 00:15:30.840 } 00:15:30.840 } 00:15:30.840 }' 00:15:30.840 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:30.840 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:30.840 pt2 00:15:30.840 pt3' 00:15:30.840 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:30.840 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:30.840 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:31.100 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:31.100 "name": "pt1", 00:15:31.100 "aliases": [ 00:15:31.100 "00000000-0000-0000-0000-000000000001" 00:15:31.100 ], 00:15:31.100 "product_name": "passthru", 00:15:31.100 "block_size": 512, 00:15:31.100 "num_blocks": 65536, 00:15:31.100 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:31.100 "assigned_rate_limits": { 00:15:31.100 "rw_ios_per_sec": 0, 00:15:31.100 "rw_mbytes_per_sec": 0, 00:15:31.100 "r_mbytes_per_sec": 0, 00:15:31.100 "w_mbytes_per_sec": 0 00:15:31.100 }, 00:15:31.100 "claimed": true, 00:15:31.100 "claim_type": "exclusive_write", 00:15:31.100 "zoned": false, 00:15:31.100 "supported_io_types": { 00:15:31.100 "read": true, 00:15:31.100 "write": true, 00:15:31.100 "unmap": true, 00:15:31.100 "write_zeroes": true, 00:15:31.100 "flush": true, 00:15:31.100 "reset": true, 00:15:31.101 "compare": false, 00:15:31.101 "compare_and_write": false, 00:15:31.101 "abort": true, 00:15:31.101 "nvme_admin": false, 00:15:31.101 "nvme_io": false 00:15:31.101 }, 00:15:31.101 "memory_domains": [ 00:15:31.101 { 00:15:31.101 "dma_device_id": "system", 00:15:31.101 "dma_device_type": 1 00:15:31.101 }, 00:15:31.101 { 00:15:31.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.101 "dma_device_type": 2 00:15:31.101 } 00:15:31.101 ], 00:15:31.101 "driver_specific": { 00:15:31.101 "passthru": { 
00:15:31.101 "name": "pt1", 00:15:31.101 "base_bdev_name": "malloc1" 00:15:31.101 } 00:15:31.101 } 00:15:31.101 }' 00:15:31.101 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.101 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.101 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:31.101 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.101 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.101 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:31.101 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.360 10:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.360 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:31.360 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.360 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.360 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:31.360 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:31.360 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:31.360 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:31.620 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:31.620 "name": "pt2", 00:15:31.620 "aliases": [ 00:15:31.620 "00000000-0000-0000-0000-000000000002" 00:15:31.620 ], 00:15:31.620 "product_name": "passthru", 00:15:31.620 "block_size": 512, 00:15:31.620 "num_blocks": 65536, 00:15:31.620 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:31.620 "assigned_rate_limits": { 00:15:31.620 "rw_ios_per_sec": 0, 00:15:31.620 "rw_mbytes_per_sec": 0, 00:15:31.620 "r_mbytes_per_sec": 0, 00:15:31.620 "w_mbytes_per_sec": 0 00:15:31.620 }, 00:15:31.620 "claimed": true, 00:15:31.620 "claim_type": "exclusive_write", 00:15:31.620 "zoned": false, 00:15:31.620 "supported_io_types": { 00:15:31.620 "read": true, 00:15:31.620 "write": true, 00:15:31.620 "unmap": true, 00:15:31.620 "write_zeroes": true, 00:15:31.620 "flush": true, 00:15:31.620 "reset": true, 00:15:31.620 "compare": false, 00:15:31.620 "compare_and_write": false, 00:15:31.620 "abort": true, 00:15:31.620 "nvme_admin": false, 00:15:31.620 "nvme_io": false 00:15:31.620 }, 00:15:31.620 "memory_domains": [ 00:15:31.620 { 00:15:31.620 "dma_device_id": "system", 00:15:31.620 "dma_device_type": 1 00:15:31.620 }, 00:15:31.620 { 00:15:31.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.620 "dma_device_type": 2 00:15:31.620 } 00:15:31.620 ], 00:15:31.620 "driver_specific": { 00:15:31.620 "passthru": { 00:15:31.620 "name": "pt2", 00:15:31.620 "base_bdev_name": "malloc2" 00:15:31.620 } 00:15:31.620 } 00:15:31.620 }' 00:15:31.620 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.620 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.620 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:31.620 10:10:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.620 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.620 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:31.620 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.881 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.881 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:31.881 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.881 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.881 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:31.881 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:31.881 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:31.881 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.142 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:32.142 "name": "pt3", 00:15:32.142 "aliases": [ 00:15:32.142 "00000000-0000-0000-0000-000000000003" 00:15:32.142 ], 00:15:32.142 "product_name": "passthru", 00:15:32.142 "block_size": 512, 00:15:32.142 "num_blocks": 65536, 00:15:32.142 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:32.142 "assigned_rate_limits": { 00:15:32.142 "rw_ios_per_sec": 0, 00:15:32.142 "rw_mbytes_per_sec": 0, 00:15:32.142 "r_mbytes_per_sec": 0, 00:15:32.142 "w_mbytes_per_sec": 0 00:15:32.142 }, 00:15:32.142 "claimed": true, 00:15:32.142 "claim_type": "exclusive_write", 00:15:32.142 "zoned": false, 00:15:32.142 "supported_io_types": { 00:15:32.142 "read": true, 00:15:32.142 "write": true, 00:15:32.142 "unmap": true, 00:15:32.142 "write_zeroes": true, 00:15:32.142 "flush": true, 00:15:32.142 "reset": true, 00:15:32.142 "compare": false, 00:15:32.142 "compare_and_write": false, 00:15:32.142 "abort": true, 00:15:32.142 "nvme_admin": false, 00:15:32.142 "nvme_io": false 00:15:32.142 }, 00:15:32.142 "memory_domains": [ 00:15:32.142 { 00:15:32.142 "dma_device_id": "system", 00:15:32.142 "dma_device_type": 1 00:15:32.142 }, 00:15:32.142 { 00:15:32.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.142 "dma_device_type": 2 00:15:32.142 } 00:15:32.142 ], 00:15:32.142 "driver_specific": { 00:15:32.142 "passthru": { 00:15:32.142 "name": "pt3", 00:15:32.142 "base_bdev_name": "malloc3" 00:15:32.142 } 00:15:32.142 } 00:15:32.142 }' 00:15:32.142 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.142 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.142 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.142 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.142 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.142 10:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.142 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.402 10:10:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.402 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.402 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.402 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.402 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.402 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:32.402 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:32.662 [2024-06-10 10:10:54.344221] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:32.662 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=e6533b47-4682-42ef-b391-4f92eddd7c53 00:15:32.662 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z e6533b47-4682-42ef-b391-4f92eddd7c53 ']' 00:15:32.662 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:32.925 [2024-06-10 10:10:54.536517] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:32.925 [2024-06-10 10:10:54.536529] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:32.925 [2024-06-10 10:10:54.536564] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:32.925 [2024-06-10 10:10:54.536615] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:32.925 [2024-06-10 10:10:54.536621] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d24830 name raid_bdev1, state offline 00:15:32.925 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.925 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:32.925 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:32.925 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:32.925 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:32.925 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:33.250 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:33.250 10:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:33.511 10:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:33.511 10:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:33.511 10:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == 
"passthru")] | any' 00:15:33.511 10:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:33.771 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:34.032 [2024-06-10 10:10:55.671345] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:34.032 [2024-06-10 10:10:55.672412] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:34.032 [2024-06-10 10:10:55.672444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:34.032 [2024-06-10 10:10:55.672479] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:34.032 [2024-06-10 10:10:55.672506] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:34.032 [2024-06-10 10:10:55.672520] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:34.032 [2024-06-10 10:10:55.672530] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:34.032 [2024-06-10 10:10:55.672535] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d1f900 name raid_bdev1, state configuring 00:15:34.032 request: 00:15:34.032 { 00:15:34.032 "name": 
"raid_bdev1", 00:15:34.032 "raid_level": "raid1", 00:15:34.032 "base_bdevs": [ 00:15:34.032 "malloc1", 00:15:34.032 "malloc2", 00:15:34.032 "malloc3" 00:15:34.032 ], 00:15:34.032 "superblock": false, 00:15:34.032 "method": "bdev_raid_create", 00:15:34.032 "req_id": 1 00:15:34.032 } 00:15:34.032 Got JSON-RPC error response 00:15:34.032 response: 00:15:34.032 { 00:15:34.032 "code": -17, 00:15:34.032 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:34.032 } 00:15:34.032 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:15:34.032 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:15:34.032 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:15:34.032 10:10:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:15:34.032 10:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.032 10:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:34.032 10:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:34.032 10:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:34.032 10:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:34.292 [2024-06-10 10:10:56.040232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:34.292 [2024-06-10 10:10:56.040255] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:34.292 [2024-06-10 10:10:56.040265] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d25f50 00:15:34.292 [2024-06-10 10:10:56.040271] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:34.292 [2024-06-10 10:10:56.041511] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:34.292 [2024-06-10 10:10:56.041530] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:34.292 [2024-06-10 10:10:56.041573] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:34.292 [2024-06-10 10:10:56.041590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:34.292 pt1 00:15:34.292 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:15:34.292 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:34.292 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.292 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:34.292 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:34.292 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.292 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.292 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.292 10:10:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.292 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.292 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.292 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:34.552 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.552 "name": "raid_bdev1", 00:15:34.552 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:34.552 "strip_size_kb": 0, 00:15:34.552 "state": "configuring", 00:15:34.552 "raid_level": "raid1", 00:15:34.552 "superblock": true, 00:15:34.552 "num_base_bdevs": 3, 00:15:34.552 "num_base_bdevs_discovered": 1, 00:15:34.552 "num_base_bdevs_operational": 3, 00:15:34.552 "base_bdevs_list": [ 00:15:34.552 { 00:15:34.552 "name": "pt1", 00:15:34.552 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:34.552 "is_configured": true, 00:15:34.552 "data_offset": 2048, 00:15:34.552 "data_size": 63488 00:15:34.552 }, 00:15:34.552 { 00:15:34.552 "name": null, 00:15:34.552 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:34.552 "is_configured": false, 00:15:34.552 "data_offset": 2048, 00:15:34.552 "data_size": 63488 00:15:34.552 }, 00:15:34.552 { 00:15:34.552 "name": null, 00:15:34.552 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:34.552 "is_configured": false, 00:15:34.552 "data_offset": 2048, 00:15:34.552 "data_size": 63488 00:15:34.552 } 00:15:34.552 ] 00:15:34.552 }' 00:15:34.552 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.552 10:10:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.122 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:35.122 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:35.122 [2024-06-10 10:10:56.962563] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:35.122 [2024-06-10 10:10:56.962593] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:35.122 [2024-06-10 10:10:56.962605] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b76bc0 00:15:35.122 [2024-06-10 10:10:56.962612] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:35.122 [2024-06-10 10:10:56.962886] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:35.122 [2024-06-10 10:10:56.962898] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:35.122 [2024-06-10 10:10:56.962939] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:35.122 [2024-06-10 10:10:56.962950] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:35.122 pt2 00:15:35.122 10:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:35.382 [2024-06-10 10:10:57.151045] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:35.382 10:10:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:15:35.382 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:35.382 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:35.382 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:35.382 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:35.382 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:35.382 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:35.382 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:35.382 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:35.382 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:35.382 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.382 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:35.643 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:35.643 "name": "raid_bdev1", 00:15:35.643 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:35.643 "strip_size_kb": 0, 00:15:35.643 "state": "configuring", 00:15:35.643 "raid_level": "raid1", 00:15:35.643 "superblock": true, 00:15:35.643 "num_base_bdevs": 3, 00:15:35.643 "num_base_bdevs_discovered": 1, 00:15:35.643 "num_base_bdevs_operational": 3, 00:15:35.643 "base_bdevs_list": [ 00:15:35.643 { 00:15:35.643 "name": "pt1", 00:15:35.643 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:35.643 "is_configured": true, 00:15:35.643 "data_offset": 2048, 00:15:35.643 "data_size": 63488 00:15:35.643 }, 00:15:35.643 { 00:15:35.643 "name": null, 00:15:35.643 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:35.643 "is_configured": false, 00:15:35.643 "data_offset": 2048, 00:15:35.643 "data_size": 63488 00:15:35.643 }, 00:15:35.643 { 00:15:35.643 "name": null, 00:15:35.643 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:35.643 "is_configured": false, 00:15:35.643 "data_offset": 2048, 00:15:35.643 "data_size": 63488 00:15:35.643 } 00:15:35.643 ] 00:15:35.643 }' 00:15:35.643 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:35.643 10:10:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.213 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:36.213 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:36.213 10:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:36.473 [2024-06-10 10:10:58.085407] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:36.473 [2024-06-10 10:10:58.085435] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:36.473 [2024-06-10 10:10:58.085445] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d20810 00:15:36.473 [2024-06-10 10:10:58.085451] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:36.473 [2024-06-10 10:10:58.085717] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:36.473 [2024-06-10 10:10:58.085727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:36.473 [2024-06-10 10:10:58.085768] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:36.473 [2024-06-10 10:10:58.085779] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:36.473 pt2 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:36.473 [2024-06-10 10:10:58.265871] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:36.473 [2024-06-10 10:10:58.265888] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:36.473 [2024-06-10 10:10:58.265896] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d20f50 00:15:36.473 [2024-06-10 10:10:58.265902] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:36.473 [2024-06-10 10:10:58.266116] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:36.473 [2024-06-10 10:10:58.266126] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:36.473 [2024-06-10 10:10:58.266157] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:36.473 [2024-06-10 10:10:58.266172] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:36.473 [2024-06-10 10:10:58.266249] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d24dd0 00:15:36.473 [2024-06-10 10:10:58.266255] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:36.473 [2024-06-10 10:10:58.266381] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d173d0 00:15:36.473 [2024-06-10 10:10:58.266481] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d24dd0 00:15:36.473 [2024-06-10 10:10:58.266486] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d24dd0 00:15:36.473 [2024-06-10 10:10:58.266555] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:36.473 pt3 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
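At this point the superblocks on the recreated pt1-pt3 have been re-examined and raid_bdev1 has come back online, and the trace enters the verify_raid_bdev_state / verify_raid_bdev_properties pass. A minimal bash sketch of those checks, with field names taken from the JSON dumped just below; the info variable and the [[ ]] assertions are illustrative, not the helpers' literal code:

  # Same RPC wrapper as in the earlier sketch.
  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # verify_raid_bdev_state raid_bdev1 online raid1 0 3: after the delete/examine
  # round trip the volume must report the same shape it had when first created.
  info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  [[ $(jq -r .state                      <<< "$info") == online ]]
  [[ $(jq -r .raid_level                 <<< "$info") == raid1  ]]
  [[ $(jq -r .num_base_bdevs_discovered  <<< "$info") == 3      ]]
  [[ $(jq -r .num_base_bdevs_operational <<< "$info") == 3      ]]

  # verify_raid_bdev_properties then probes the volume and each base bdev,
  # e.g. block_size 512 and md_size/md_interleave/dif_type all null, matching
  # the repeated jq checks throughout this trace.
  $RPC bdev_get_bdevs -b raid_bdev1 | jq '.[]'
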
00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.473 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:36.734 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.734 "name": "raid_bdev1", 00:15:36.734 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:36.734 "strip_size_kb": 0, 00:15:36.734 "state": "online", 00:15:36.734 "raid_level": "raid1", 00:15:36.734 "superblock": true, 00:15:36.734 "num_base_bdevs": 3, 00:15:36.734 "num_base_bdevs_discovered": 3, 00:15:36.734 "num_base_bdevs_operational": 3, 00:15:36.734 "base_bdevs_list": [ 00:15:36.734 { 00:15:36.734 "name": "pt1", 00:15:36.734 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:36.734 "is_configured": true, 00:15:36.734 "data_offset": 2048, 00:15:36.734 "data_size": 63488 00:15:36.734 }, 00:15:36.734 { 00:15:36.734 "name": "pt2", 00:15:36.734 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:36.734 "is_configured": true, 00:15:36.734 "data_offset": 2048, 00:15:36.734 "data_size": 63488 00:15:36.734 }, 00:15:36.734 { 00:15:36.734 "name": "pt3", 00:15:36.734 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:36.734 "is_configured": true, 00:15:36.734 "data_offset": 2048, 00:15:36.734 "data_size": 63488 00:15:36.734 } 00:15:36.734 ] 00:15:36.734 }' 00:15:36.734 10:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.734 10:10:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.303 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:37.303 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:37.303 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:37.303 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:37.303 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:37.303 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:37.304 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:37.304 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:37.564 [2024-06-10 10:10:59.176374] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:37.564 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:37.564 "name": "raid_bdev1", 00:15:37.564 
"aliases": [ 00:15:37.564 "e6533b47-4682-42ef-b391-4f92eddd7c53" 00:15:37.564 ], 00:15:37.564 "product_name": "Raid Volume", 00:15:37.564 "block_size": 512, 00:15:37.564 "num_blocks": 63488, 00:15:37.564 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:37.564 "assigned_rate_limits": { 00:15:37.564 "rw_ios_per_sec": 0, 00:15:37.564 "rw_mbytes_per_sec": 0, 00:15:37.564 "r_mbytes_per_sec": 0, 00:15:37.564 "w_mbytes_per_sec": 0 00:15:37.564 }, 00:15:37.564 "claimed": false, 00:15:37.564 "zoned": false, 00:15:37.564 "supported_io_types": { 00:15:37.564 "read": true, 00:15:37.564 "write": true, 00:15:37.564 "unmap": false, 00:15:37.564 "write_zeroes": true, 00:15:37.564 "flush": false, 00:15:37.564 "reset": true, 00:15:37.564 "compare": false, 00:15:37.564 "compare_and_write": false, 00:15:37.564 "abort": false, 00:15:37.564 "nvme_admin": false, 00:15:37.564 "nvme_io": false 00:15:37.564 }, 00:15:37.564 "memory_domains": [ 00:15:37.564 { 00:15:37.564 "dma_device_id": "system", 00:15:37.564 "dma_device_type": 1 00:15:37.564 }, 00:15:37.564 { 00:15:37.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.564 "dma_device_type": 2 00:15:37.564 }, 00:15:37.564 { 00:15:37.564 "dma_device_id": "system", 00:15:37.564 "dma_device_type": 1 00:15:37.564 }, 00:15:37.564 { 00:15:37.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.564 "dma_device_type": 2 00:15:37.564 }, 00:15:37.564 { 00:15:37.564 "dma_device_id": "system", 00:15:37.564 "dma_device_type": 1 00:15:37.564 }, 00:15:37.564 { 00:15:37.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.564 "dma_device_type": 2 00:15:37.564 } 00:15:37.564 ], 00:15:37.564 "driver_specific": { 00:15:37.564 "raid": { 00:15:37.564 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:37.564 "strip_size_kb": 0, 00:15:37.564 "state": "online", 00:15:37.564 "raid_level": "raid1", 00:15:37.564 "superblock": true, 00:15:37.564 "num_base_bdevs": 3, 00:15:37.564 "num_base_bdevs_discovered": 3, 00:15:37.564 "num_base_bdevs_operational": 3, 00:15:37.564 "base_bdevs_list": [ 00:15:37.564 { 00:15:37.564 "name": "pt1", 00:15:37.564 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:37.564 "is_configured": true, 00:15:37.564 "data_offset": 2048, 00:15:37.564 "data_size": 63488 00:15:37.564 }, 00:15:37.564 { 00:15:37.564 "name": "pt2", 00:15:37.564 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:37.564 "is_configured": true, 00:15:37.564 "data_offset": 2048, 00:15:37.564 "data_size": 63488 00:15:37.564 }, 00:15:37.564 { 00:15:37.564 "name": "pt3", 00:15:37.564 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:37.564 "is_configured": true, 00:15:37.564 "data_offset": 2048, 00:15:37.564 "data_size": 63488 00:15:37.564 } 00:15:37.564 ] 00:15:37.564 } 00:15:37.564 } 00:15:37.564 }' 00:15:37.564 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:37.564 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:37.564 pt2 00:15:37.564 pt3' 00:15:37.564 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:37.564 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:37.564 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:37.825 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
base_bdev_info='{ 00:15:37.825 "name": "pt1", 00:15:37.825 "aliases": [ 00:15:37.825 "00000000-0000-0000-0000-000000000001" 00:15:37.825 ], 00:15:37.825 "product_name": "passthru", 00:15:37.825 "block_size": 512, 00:15:37.825 "num_blocks": 65536, 00:15:37.825 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:37.825 "assigned_rate_limits": { 00:15:37.825 "rw_ios_per_sec": 0, 00:15:37.825 "rw_mbytes_per_sec": 0, 00:15:37.825 "r_mbytes_per_sec": 0, 00:15:37.825 "w_mbytes_per_sec": 0 00:15:37.825 }, 00:15:37.825 "claimed": true, 00:15:37.825 "claim_type": "exclusive_write", 00:15:37.825 "zoned": false, 00:15:37.825 "supported_io_types": { 00:15:37.825 "read": true, 00:15:37.825 "write": true, 00:15:37.825 "unmap": true, 00:15:37.825 "write_zeroes": true, 00:15:37.825 "flush": true, 00:15:37.825 "reset": true, 00:15:37.825 "compare": false, 00:15:37.825 "compare_and_write": false, 00:15:37.825 "abort": true, 00:15:37.825 "nvme_admin": false, 00:15:37.825 "nvme_io": false 00:15:37.825 }, 00:15:37.825 "memory_domains": [ 00:15:37.825 { 00:15:37.825 "dma_device_id": "system", 00:15:37.825 "dma_device_type": 1 00:15:37.825 }, 00:15:37.825 { 00:15:37.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.825 "dma_device_type": 2 00:15:37.825 } 00:15:37.825 ], 00:15:37.825 "driver_specific": { 00:15:37.825 "passthru": { 00:15:37.825 "name": "pt1", 00:15:37.825 "base_bdev_name": "malloc1" 00:15:37.825 } 00:15:37.825 } 00:15:37.825 }' 00:15:37.825 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.825 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.825 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:37.825 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.825 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.825 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:37.825 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.825 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.085 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:38.085 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:38.085 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:38.085 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:38.085 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.085 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:38.085 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.346 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.346 "name": "pt2", 00:15:38.346 "aliases": [ 00:15:38.346 "00000000-0000-0000-0000-000000000002" 00:15:38.346 ], 00:15:38.346 "product_name": "passthru", 00:15:38.346 "block_size": 512, 00:15:38.346 "num_blocks": 65536, 00:15:38.346 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:38.346 "assigned_rate_limits": { 00:15:38.346 "rw_ios_per_sec": 0, 00:15:38.346 "rw_mbytes_per_sec": 0, 00:15:38.346 
"r_mbytes_per_sec": 0, 00:15:38.346 "w_mbytes_per_sec": 0 00:15:38.346 }, 00:15:38.346 "claimed": true, 00:15:38.346 "claim_type": "exclusive_write", 00:15:38.346 "zoned": false, 00:15:38.346 "supported_io_types": { 00:15:38.346 "read": true, 00:15:38.346 "write": true, 00:15:38.346 "unmap": true, 00:15:38.346 "write_zeroes": true, 00:15:38.346 "flush": true, 00:15:38.346 "reset": true, 00:15:38.346 "compare": false, 00:15:38.346 "compare_and_write": false, 00:15:38.346 "abort": true, 00:15:38.346 "nvme_admin": false, 00:15:38.346 "nvme_io": false 00:15:38.346 }, 00:15:38.346 "memory_domains": [ 00:15:38.346 { 00:15:38.346 "dma_device_id": "system", 00:15:38.346 "dma_device_type": 1 00:15:38.346 }, 00:15:38.346 { 00:15:38.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.346 "dma_device_type": 2 00:15:38.346 } 00:15:38.346 ], 00:15:38.346 "driver_specific": { 00:15:38.346 "passthru": { 00:15:38.346 "name": "pt2", 00:15:38.346 "base_bdev_name": "malloc2" 00:15:38.346 } 00:15:38.346 } 00:15:38.346 }' 00:15:38.346 10:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.346 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.346 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:38.346 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.346 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.346 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:38.346 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.607 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.607 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:38.607 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:38.607 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:38.607 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:38.607 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.607 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:38.607 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.867 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.867 "name": "pt3", 00:15:38.867 "aliases": [ 00:15:38.867 "00000000-0000-0000-0000-000000000003" 00:15:38.867 ], 00:15:38.867 "product_name": "passthru", 00:15:38.867 "block_size": 512, 00:15:38.867 "num_blocks": 65536, 00:15:38.867 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:38.867 "assigned_rate_limits": { 00:15:38.867 "rw_ios_per_sec": 0, 00:15:38.867 "rw_mbytes_per_sec": 0, 00:15:38.867 "r_mbytes_per_sec": 0, 00:15:38.867 "w_mbytes_per_sec": 0 00:15:38.867 }, 00:15:38.867 "claimed": true, 00:15:38.867 "claim_type": "exclusive_write", 00:15:38.867 "zoned": false, 00:15:38.867 "supported_io_types": { 00:15:38.867 "read": true, 00:15:38.867 "write": true, 00:15:38.867 "unmap": true, 00:15:38.867 "write_zeroes": true, 00:15:38.867 "flush": true, 00:15:38.867 "reset": true, 00:15:38.867 "compare": false, 00:15:38.867 
"compare_and_write": false, 00:15:38.867 "abort": true, 00:15:38.867 "nvme_admin": false, 00:15:38.867 "nvme_io": false 00:15:38.867 }, 00:15:38.867 "memory_domains": [ 00:15:38.867 { 00:15:38.867 "dma_device_id": "system", 00:15:38.867 "dma_device_type": 1 00:15:38.867 }, 00:15:38.867 { 00:15:38.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.867 "dma_device_type": 2 00:15:38.867 } 00:15:38.867 ], 00:15:38.867 "driver_specific": { 00:15:38.867 "passthru": { 00:15:38.867 "name": "pt3", 00:15:38.867 "base_bdev_name": "malloc3" 00:15:38.867 } 00:15:38.867 } 00:15:38.867 }' 00:15:38.867 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.867 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.867 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:38.867 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.867 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.867 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:38.867 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.867 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.128 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.128 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.128 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.128 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.128 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:39.128 10:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:39.388 [2024-06-10 10:11:00.997054] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' e6533b47-4682-42ef-b391-4f92eddd7c53 '!=' e6533b47-4682-42ef-b391-4f92eddd7c53 ']' 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:39.388 [2024-06-10 10:11:01.193372] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:39.388 10:11:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.388 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:39.648 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:39.648 "name": "raid_bdev1", 00:15:39.648 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:39.648 "strip_size_kb": 0, 00:15:39.648 "state": "online", 00:15:39.648 "raid_level": "raid1", 00:15:39.648 "superblock": true, 00:15:39.648 "num_base_bdevs": 3, 00:15:39.648 "num_base_bdevs_discovered": 2, 00:15:39.648 "num_base_bdevs_operational": 2, 00:15:39.648 "base_bdevs_list": [ 00:15:39.648 { 00:15:39.648 "name": null, 00:15:39.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:39.648 "is_configured": false, 00:15:39.648 "data_offset": 2048, 00:15:39.648 "data_size": 63488 00:15:39.648 }, 00:15:39.648 { 00:15:39.648 "name": "pt2", 00:15:39.648 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:39.648 "is_configured": true, 00:15:39.648 "data_offset": 2048, 00:15:39.648 "data_size": 63488 00:15:39.648 }, 00:15:39.648 { 00:15:39.648 "name": "pt3", 00:15:39.648 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:39.648 "is_configured": true, 00:15:39.648 "data_offset": 2048, 00:15:39.648 "data_size": 63488 00:15:39.648 } 00:15:39.648 ] 00:15:39.648 }' 00:15:39.648 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:39.648 10:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:40.218 10:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:40.480 [2024-06-10 10:11:02.127728] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:40.480 [2024-06-10 10:11:02.127743] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:40.480 [2024-06-10 10:11:02.127773] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:40.480 [2024-06-10 10:11:02.127813] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:40.480 [2024-06-10 10:11:02.127819] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d24dd0 name raid_bdev1, state offline 00:15:40.480 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.480 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:15:40.480 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:15:40.480 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- 
# '[' -n '' ']' 00:15:40.480 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:15:40.480 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:15:40.480 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:40.741 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:15:40.741 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:15:40.741 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:41.001 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:15:41.001 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:15:41.001 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:15:41.001 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:15:41.001 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:41.261 [2024-06-10 10:11:02.881604] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:41.261 [2024-06-10 10:11:02.881632] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:41.261 [2024-06-10 10:11:02.881642] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d24ab0 00:15:41.261 [2024-06-10 10:11:02.881653] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:41.261 [2024-06-10 10:11:02.882943] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:41.261 [2024-06-10 10:11:02.882965] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:41.261 [2024-06-10 10:11:02.883014] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:41.261 [2024-06-10 10:11:02.883033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:41.261 pt2 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 
-- # local tmp 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.261 10:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:41.261 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:41.261 "name": "raid_bdev1", 00:15:41.261 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:41.261 "strip_size_kb": 0, 00:15:41.261 "state": "configuring", 00:15:41.261 "raid_level": "raid1", 00:15:41.261 "superblock": true, 00:15:41.261 "num_base_bdevs": 3, 00:15:41.261 "num_base_bdevs_discovered": 1, 00:15:41.261 "num_base_bdevs_operational": 2, 00:15:41.261 "base_bdevs_list": [ 00:15:41.261 { 00:15:41.261 "name": null, 00:15:41.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:41.261 "is_configured": false, 00:15:41.261 "data_offset": 2048, 00:15:41.261 "data_size": 63488 00:15:41.261 }, 00:15:41.261 { 00:15:41.261 "name": "pt2", 00:15:41.261 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:41.261 "is_configured": true, 00:15:41.261 "data_offset": 2048, 00:15:41.261 "data_size": 63488 00:15:41.261 }, 00:15:41.261 { 00:15:41.261 "name": null, 00:15:41.261 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:41.261 "is_configured": false, 00:15:41.261 "data_offset": 2048, 00:15:41.261 "data_size": 63488 00:15:41.261 } 00:15:41.261 ] 00:15:41.261 }' 00:15:41.261 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:41.261 10:11:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:41.831 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:15:41.831 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:15:41.831 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:15:41.831 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:42.091 [2024-06-10 10:11:03.807967] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:42.091 [2024-06-10 10:11:03.807994] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:42.091 [2024-06-10 10:11:03.808003] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d214d0 00:15:42.091 [2024-06-10 10:11:03.808010] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:42.091 [2024-06-10 10:11:03.808267] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:42.091 [2024-06-10 10:11:03.808278] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:42.091 [2024-06-10 10:11:03.808321] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:42.091 [2024-06-10 10:11:03.808332] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:42.091 [2024-06-10 10:11:03.808406] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d17060 00:15:42.091 [2024-06-10 10:11:03.808412] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:42.091 [2024-06-10 10:11:03.808538] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d28ad0 00:15:42.091 [2024-06-10 10:11:03.808635] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d17060 00:15:42.091 [2024-06-10 10:11:03.808640] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d17060 00:15:42.091 [2024-06-10 10:11:03.808710] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:42.091 pt3 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.091 10:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:42.351 10:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:42.351 "name": "raid_bdev1", 00:15:42.351 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:42.351 "strip_size_kb": 0, 00:15:42.351 "state": "online", 00:15:42.351 "raid_level": "raid1", 00:15:42.351 "superblock": true, 00:15:42.351 "num_base_bdevs": 3, 00:15:42.351 "num_base_bdevs_discovered": 2, 00:15:42.351 "num_base_bdevs_operational": 2, 00:15:42.351 "base_bdevs_list": [ 00:15:42.351 { 00:15:42.351 "name": null, 00:15:42.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:42.351 "is_configured": false, 00:15:42.351 "data_offset": 2048, 00:15:42.351 "data_size": 63488 00:15:42.351 }, 00:15:42.351 { 00:15:42.351 "name": "pt2", 00:15:42.351 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:42.351 "is_configured": true, 00:15:42.351 "data_offset": 2048, 00:15:42.351 "data_size": 63488 00:15:42.351 }, 00:15:42.351 { 00:15:42.351 "name": "pt3", 00:15:42.351 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:42.351 "is_configured": true, 00:15:42.351 "data_offset": 2048, 00:15:42.351 "data_size": 63488 00:15:42.351 } 00:15:42.351 ] 00:15:42.351 }' 00:15:42.351 10:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:42.351 10:11:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.921 10:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:42.921 [2024-06-10 10:11:04.698221] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:42.921 [2024-06-10 10:11:04.698237] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:42.921 [2024-06-10 10:11:04.698277] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:42.921 [2024-06-10 10:11:04.698316] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:42.922 [2024-06-10 10:11:04.698322] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d17060 name raid_bdev1, state offline 00:15:42.922 10:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.922 10:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:15:43.181 10:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:15:43.181 10:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:15:43.181 10:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:15:43.181 10:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:15:43.181 10:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:43.441 [2024-06-10 10:11:05.243578] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:43.441 [2024-06-10 10:11:05.243606] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:43.441 [2024-06-10 10:11:05.243616] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d23880 00:15:43.441 [2024-06-10 10:11:05.243624] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:43.441 [2024-06-10 10:11:05.244890] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:43.441 [2024-06-10 10:11:05.244907] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:43.441 [2024-06-10 10:11:05.244951] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:43.441 [2024-06-10 10:11:05.244967] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:43.441 [2024-06-10 10:11:05.245037] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:15:43.441 [2024-06-10 10:11:05.245044] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:43.441 [2024-06-10 10:11:05.245052] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b75680 name raid_bdev1, state configuring 00:15:43.441 [2024-06-10 10:11:05.245066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:43.441 pt1 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:15:43.441 10:11:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:43.441 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:43.701 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:43.701 "name": "raid_bdev1", 00:15:43.701 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:43.701 "strip_size_kb": 0, 00:15:43.701 "state": "configuring", 00:15:43.701 "raid_level": "raid1", 00:15:43.701 "superblock": true, 00:15:43.701 "num_base_bdevs": 3, 00:15:43.701 "num_base_bdevs_discovered": 1, 00:15:43.701 "num_base_bdevs_operational": 2, 00:15:43.701 "base_bdevs_list": [ 00:15:43.701 { 00:15:43.701 "name": null, 00:15:43.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:43.701 "is_configured": false, 00:15:43.701 "data_offset": 2048, 00:15:43.701 "data_size": 63488 00:15:43.701 }, 00:15:43.701 { 00:15:43.701 "name": "pt2", 00:15:43.701 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:43.701 "is_configured": true, 00:15:43.701 "data_offset": 2048, 00:15:43.701 "data_size": 63488 00:15:43.701 }, 00:15:43.701 { 00:15:43.701 "name": null, 00:15:43.701 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:43.701 "is_configured": false, 00:15:43.701 "data_offset": 2048, 00:15:43.701 "data_size": 63488 00:15:43.701 } 00:15:43.701 ] 00:15:43.701 }' 00:15:43.701 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:43.701 10:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:44.270 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:15:44.270 10:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:44.530 [2024-06-10 10:11:06.314289] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:44.530 [2024-06-10 10:11:06.314320] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:44.530 [2024-06-10 10:11:06.314331] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d27fa0 00:15:44.530 [2024-06-10 10:11:06.314337] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:44.530 [2024-06-10 10:11:06.314613] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:44.530 [2024-06-10 10:11:06.314623] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:44.530 [2024-06-10 10:11:06.314666] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:44.530 [2024-06-10 10:11:06.314678] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:44.530 [2024-06-10 10:11:06.314750] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d276f0 00:15:44.530 [2024-06-10 10:11:06.314755] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:44.530 [2024-06-10 10:11:06.314895] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d28ad0 00:15:44.530 [2024-06-10 10:11:06.314994] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d276f0 00:15:44.530 [2024-06-10 10:11:06.314999] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d276f0 00:15:44.530 [2024-06-10 10:11:06.315070] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:44.530 pt3 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.530 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.531 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:44.531 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.790 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.790 "name": "raid_bdev1", 00:15:44.790 "uuid": "e6533b47-4682-42ef-b391-4f92eddd7c53", 00:15:44.790 "strip_size_kb": 0, 00:15:44.790 "state": "online", 00:15:44.790 "raid_level": "raid1", 00:15:44.790 "superblock": true, 00:15:44.790 "num_base_bdevs": 3, 00:15:44.790 "num_base_bdevs_discovered": 2, 00:15:44.790 "num_base_bdevs_operational": 2, 00:15:44.790 "base_bdevs_list": [ 00:15:44.790 { 00:15:44.790 "name": null, 00:15:44.790 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:15:44.790 "is_configured": false, 00:15:44.790 "data_offset": 2048, 00:15:44.790 "data_size": 63488 00:15:44.790 }, 00:15:44.790 { 00:15:44.790 "name": "pt2", 00:15:44.790 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:44.790 "is_configured": true, 00:15:44.790 "data_offset": 2048, 00:15:44.790 "data_size": 63488 00:15:44.790 }, 00:15:44.790 { 00:15:44.790 "name": "pt3", 00:15:44.790 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:44.790 "is_configured": true, 00:15:44.790 "data_offset": 2048, 00:15:44.790 "data_size": 63488 00:15:44.790 } 00:15:44.790 ] 00:15:44.790 }' 00:15:44.791 10:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.791 10:11:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.360 10:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:15:45.360 10:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:15:45.621 [2024-06-10 10:11:07.433288] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' e6533b47-4682-42ef-b391-4f92eddd7c53 '!=' e6533b47-4682-42ef-b391-4f92eddd7c53 ']' 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1015609 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1015609 ']' 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1015609 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1015609 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1015609' 00:15:45.621 killing process with pid 1015609 00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1015609 00:15:45.621 [2024-06-10 10:11:07.483098] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:45.621 [2024-06-10 10:11:07.483136] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:45.621 [2024-06-10 10:11:07.483176] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:45.621 [2024-06-10 10:11:07.483181] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d276f0 name raid_bdev1, state offline 
00:15:45.621 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1015609 00:15:45.881 [2024-06-10 10:11:07.498226] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:45.881 10:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:45.881 00:15:45.881 real 0m18.219s 00:15:45.881 user 0m33.944s 00:15:45.881 sys 0m2.675s 00:15:45.881 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:45.881 10:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.881 ************************************ 00:15:45.881 END TEST raid_superblock_test 00:15:45.881 ************************************ 00:15:45.881 10:11:07 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:15:45.881 10:11:07 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:15:45.881 10:11:07 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:45.881 10:11:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:45.881 ************************************ 00:15:45.881 START TEST raid_read_error_test 00:15:45.881 ************************************ 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 3 read 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 
00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.JzXKzdQM1W 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1019133 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1019133 /var/tmp/spdk-raid.sock 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1019133 ']' 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:45.881 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:15:45.881 10:11:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:46.142 [2024-06-10 10:11:07.761775] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:15:46.142 [2024-06-10 10:11:07.761820] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1019133 ] 00:15:46.142 [2024-06-10 10:11:07.847692] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:46.142 [2024-06-10 10:11:07.911192] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:15:46.142 [2024-06-10 10:11:07.956233] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:46.142 [2024-06-10 10:11:07.956252] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:47.083 10:11:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:15:47.083 10:11:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:15:47.083 10:11:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:47.083 10:11:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:47.083 BaseBdev1_malloc 00:15:47.083 10:11:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:47.344 true 00:15:47.344 10:11:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:47.344 [2024-06-10 10:11:09.150818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:47.344 [2024-06-10 10:11:09.150853] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:47.344 [2024-06-10 10:11:09.150864] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2966d10 00:15:47.344 [2024-06-10 10:11:09.150870] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:47.344 [2024-06-10 10:11:09.152240] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:47.344 [2024-06-10 10:11:09.152260] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:47.344 BaseBdev1 00:15:47.344 10:11:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:47.344 10:11:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:47.605 BaseBdev2_malloc 00:15:47.605 10:11:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:47.866 true 00:15:47.866 10:11:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:47.866 [2024-06-10 10:11:09.706099] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:47.866 [2024-06-10 10:11:09.706127] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:15:47.866 [2024-06-10 10:11:09.706138] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x296b710 00:15:47.866 [2024-06-10 10:11:09.706144] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:47.866 [2024-06-10 10:11:09.707326] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:47.866 [2024-06-10 10:11:09.707345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:47.866 BaseBdev2 00:15:47.866 10:11:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:47.866 10:11:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:48.126 BaseBdev3_malloc 00:15:48.127 10:11:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:48.387 true 00:15:48.387 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:48.648 [2024-06-10 10:11:10.273483] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:48.648 [2024-06-10 10:11:10.273513] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:48.648 [2024-06-10 10:11:10.273524] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x296c340 00:15:48.648 [2024-06-10 10:11:10.273530] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:48.648 [2024-06-10 10:11:10.274740] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:48.648 [2024-06-10 10:11:10.274758] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:48.648 BaseBdev3 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:48.648 [2024-06-10 10:11:10.465982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:48.648 [2024-06-10 10:11:10.467001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:48.648 [2024-06-10 10:11:10.467053] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:48.648 [2024-06-10 10:11:10.467213] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x296f160 00:15:48.648 [2024-06-10 10:11:10.467220] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:48.648 [2024-06-10 10:11:10.467365] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x29686e0 00:15:48.648 [2024-06-10 10:11:10.467486] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x296f160 00:15:48.648 [2024-06-10 10:11:10.467492] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x296f160 00:15:48.648 [2024-06-10 10:11:10.467566] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.648 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:48.909 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.909 "name": "raid_bdev1", 00:15:48.909 "uuid": "2a5d92f2-9e07-4f7a-8612-87158d4df215", 00:15:48.909 "strip_size_kb": 0, 00:15:48.909 "state": "online", 00:15:48.909 "raid_level": "raid1", 00:15:48.909 "superblock": true, 00:15:48.909 "num_base_bdevs": 3, 00:15:48.909 "num_base_bdevs_discovered": 3, 00:15:48.909 "num_base_bdevs_operational": 3, 00:15:48.909 "base_bdevs_list": [ 00:15:48.909 { 00:15:48.909 "name": "BaseBdev1", 00:15:48.909 "uuid": "7d6e167f-520a-5483-84e9-46d82cde2e02", 00:15:48.909 "is_configured": true, 00:15:48.909 "data_offset": 2048, 00:15:48.909 "data_size": 63488 00:15:48.909 }, 00:15:48.909 { 00:15:48.909 "name": "BaseBdev2", 00:15:48.909 "uuid": "5928d85a-8d1a-549c-ae9f-9c72634f0cec", 00:15:48.909 "is_configured": true, 00:15:48.909 "data_offset": 2048, 00:15:48.909 "data_size": 63488 00:15:48.909 }, 00:15:48.909 { 00:15:48.909 "name": "BaseBdev3", 00:15:48.909 "uuid": "e9bd0e58-de00-5e95-8525-3e84897b5cc0", 00:15:48.909 "is_configured": true, 00:15:48.909 "data_offset": 2048, 00:15:48.909 "data_size": 63488 00:15:48.909 } 00:15:48.909 ] 00:15:48.909 }' 00:15:48.909 10:11:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.909 10:11:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.480 10:11:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:49.480 10:11:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:49.480 [2024-06-10 10:11:11.252151] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2968080 00:15:50.434 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:50.694 10:11:12 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.694 "name": "raid_bdev1", 00:15:50.694 "uuid": "2a5d92f2-9e07-4f7a-8612-87158d4df215", 00:15:50.694 "strip_size_kb": 0, 00:15:50.694 "state": "online", 00:15:50.694 "raid_level": "raid1", 00:15:50.694 "superblock": true, 00:15:50.694 "num_base_bdevs": 3, 00:15:50.694 "num_base_bdevs_discovered": 3, 00:15:50.694 "num_base_bdevs_operational": 3, 00:15:50.694 "base_bdevs_list": [ 00:15:50.694 { 00:15:50.694 "name": "BaseBdev1", 00:15:50.694 "uuid": "7d6e167f-520a-5483-84e9-46d82cde2e02", 00:15:50.694 "is_configured": true, 00:15:50.694 "data_offset": 2048, 00:15:50.694 "data_size": 63488 00:15:50.694 }, 00:15:50.694 { 00:15:50.694 "name": "BaseBdev2", 00:15:50.694 "uuid": "5928d85a-8d1a-549c-ae9f-9c72634f0cec", 00:15:50.694 "is_configured": true, 00:15:50.694 "data_offset": 2048, 00:15:50.694 "data_size": 63488 00:15:50.694 }, 00:15:50.694 { 00:15:50.694 "name": "BaseBdev3", 00:15:50.694 "uuid": "e9bd0e58-de00-5e95-8525-3e84897b5cc0", 00:15:50.694 "is_configured": true, 00:15:50.694 "data_offset": 2048, 00:15:50.694 "data_size": 63488 00:15:50.694 } 00:15:50.694 ] 00:15:50.694 }' 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.694 10:11:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.264 10:11:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:51.525 [2024-06-10 10:11:13.224408] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:51.525 [2024-06-10 10:11:13.224437] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state 
changing from online to offline 00:15:51.525 [2024-06-10 10:11:13.227000] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:51.525 [2024-06-10 10:11:13.227024] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:51.525 [2024-06-10 10:11:13.227099] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:51.525 [2024-06-10 10:11:13.227105] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x296f160 name raid_bdev1, state offline 00:15:51.525 0 00:15:51.525 10:11:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1019133 00:15:51.525 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1019133 ']' 00:15:51.525 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1019133 00:15:51.525 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:15:51.525 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:15:51.525 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1019133 00:15:51.525 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:15:51.525 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:15:51.525 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1019133' 00:15:51.525 killing process with pid 1019133 00:15:51.525 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1019133 00:15:51.525 [2024-06-10 10:11:13.275335] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:51.525 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1019133 00:15:51.525 [2024-06-10 10:11:13.286606] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:51.786 10:11:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:51.786 10:11:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:51.786 10:11:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.JzXKzdQM1W 00:15:51.786 10:11:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:15:51.786 10:11:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:15:51.786 10:11:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:51.786 10:11:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:51.786 10:11:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:15:51.786 00:15:51.786 real 0m5.722s 00:15:51.786 user 0m9.067s 00:15:51.786 sys 0m0.814s 00:15:51.786 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:51.786 10:11:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.786 ************************************ 00:15:51.786 END TEST raid_read_error_test 00:15:51.786 ************************************ 00:15:51.786 10:11:13 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:15:51.786 10:11:13 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:15:51.786 10:11:13 bdev_raid -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:15:51.786 10:11:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:51.786 ************************************ 00:15:51.786 START TEST raid_write_error_test 00:15:51.786 ************************************ 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 3 write 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.0tDvOq7S9T 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1020170 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1020170 /var/tmp/spdk-raid.sock 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:51.786 10:11:13 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1020170 ']' 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:51.786 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:15:51.786 10:11:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.786 [2024-06-10 10:11:13.558089] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:15:51.786 [2024-06-10 10:11:13.558138] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1020170 ] 00:15:51.786 [2024-06-10 10:11:13.624984] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:52.046 [2024-06-10 10:11:13.690462] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:15:52.046 [2024-06-10 10:11:13.731583] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:52.046 [2024-06-10 10:11:13.731606] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:52.660 10:11:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:15:52.660 10:11:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:15:52.660 10:11:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:52.660 10:11:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:52.944 BaseBdev1_malloc 00:15:52.944 10:11:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:52.944 true 00:15:52.944 10:11:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:53.204 [2024-06-10 10:11:14.926848] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:53.204 [2024-06-10 10:11:14.926890] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:53.204 [2024-06-10 10:11:14.926902] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe8ed10 00:15:53.204 [2024-06-10 10:11:14.926908] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:53.204 [2024-06-10 10:11:14.928288] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:53.204 [2024-06-10 10:11:14.928308] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:53.204 BaseBdev1 00:15:53.204 10:11:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in 
"${base_bdevs[@]}" 00:15:53.204 10:11:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:53.464 BaseBdev2_malloc 00:15:53.464 10:11:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:53.464 true 00:15:53.464 10:11:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:53.724 [2024-06-10 10:11:15.486144] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:53.724 [2024-06-10 10:11:15.486171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:53.724 [2024-06-10 10:11:15.486183] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe93710 00:15:53.724 [2024-06-10 10:11:15.486189] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:53.724 [2024-06-10 10:11:15.487373] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:53.724 [2024-06-10 10:11:15.487392] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:53.724 BaseBdev2 00:15:53.724 10:11:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:53.725 10:11:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:53.984 BaseBdev3_malloc 00:15:53.984 10:11:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:54.244 true 00:15:54.245 10:11:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:54.245 [2024-06-10 10:11:16.041472] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:54.245 [2024-06-10 10:11:16.041498] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:54.245 [2024-06-10 10:11:16.041508] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe94340 00:15:54.245 [2024-06-10 10:11:16.041514] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:54.245 [2024-06-10 10:11:16.042685] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:54.245 [2024-06-10 10:11:16.042708] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:54.245 BaseBdev3 00:15:54.245 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:54.505 [2024-06-10 10:11:16.229967] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:54.505 [2024-06-10 10:11:16.230965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev2 is claimed 00:15:54.505 [2024-06-10 10:11:16.231018] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:54.505 [2024-06-10 10:11:16.231176] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe97160 00:15:54.505 [2024-06-10 10:11:16.231183] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:54.505 [2024-06-10 10:11:16.231326] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe906e0 00:15:54.505 [2024-06-10 10:11:16.231445] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe97160 00:15:54.505 [2024-06-10 10:11:16.231450] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe97160 00:15:54.505 [2024-06-10 10:11:16.231524] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.505 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:54.797 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:54.797 "name": "raid_bdev1", 00:15:54.797 "uuid": "39741faa-0f82-4695-9c0f-c606ab8be5e2", 00:15:54.797 "strip_size_kb": 0, 00:15:54.797 "state": "online", 00:15:54.797 "raid_level": "raid1", 00:15:54.797 "superblock": true, 00:15:54.797 "num_base_bdevs": 3, 00:15:54.797 "num_base_bdevs_discovered": 3, 00:15:54.797 "num_base_bdevs_operational": 3, 00:15:54.797 "base_bdevs_list": [ 00:15:54.797 { 00:15:54.797 "name": "BaseBdev1", 00:15:54.797 "uuid": "5bf7cb57-430a-5b2d-971d-bd4b3722a3dc", 00:15:54.797 "is_configured": true, 00:15:54.797 "data_offset": 2048, 00:15:54.797 "data_size": 63488 00:15:54.797 }, 00:15:54.797 { 00:15:54.797 "name": "BaseBdev2", 00:15:54.798 "uuid": "07f629da-555d-57bd-9fb4-34b0801101f8", 00:15:54.798 "is_configured": true, 00:15:54.798 "data_offset": 2048, 00:15:54.798 "data_size": 63488 00:15:54.798 }, 00:15:54.798 { 00:15:54.798 "name": "BaseBdev3", 00:15:54.798 "uuid": "23f9d00f-c579-517c-94a7-400a6cad07ff", 00:15:54.798 "is_configured": true, 00:15:54.798 "data_offset": 2048, 00:15:54.798 "data_size": 63488 00:15:54.798 } 00:15:54.798 ] 00:15:54.798 }' 00:15:54.798 
10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:54.798 10:11:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.367 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:55.367 10:11:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:55.367 [2024-06-10 10:11:17.024160] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe90080 00:15:56.305 10:11:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:56.305 [2024-06-10 10:11:18.115344] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:15:56.305 [2024-06-10 10:11:18.115384] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:56.305 [2024-06-10 10:11:18.115559] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xe90080 00:15:56.305 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:56.305 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:15:56.305 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:15:56.305 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.306 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:56.566 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:56.566 "name": "raid_bdev1", 00:15:56.566 "uuid": "39741faa-0f82-4695-9c0f-c606ab8be5e2", 00:15:56.566 "strip_size_kb": 0, 00:15:56.566 "state": "online", 00:15:56.566 "raid_level": "raid1", 00:15:56.566 "superblock": true, 00:15:56.566 "num_base_bdevs": 3, 00:15:56.566 "num_base_bdevs_discovered": 2, 00:15:56.566 
"num_base_bdevs_operational": 2, 00:15:56.566 "base_bdevs_list": [ 00:15:56.566 { 00:15:56.566 "name": null, 00:15:56.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:56.566 "is_configured": false, 00:15:56.566 "data_offset": 2048, 00:15:56.566 "data_size": 63488 00:15:56.566 }, 00:15:56.566 { 00:15:56.566 "name": "BaseBdev2", 00:15:56.566 "uuid": "07f629da-555d-57bd-9fb4-34b0801101f8", 00:15:56.566 "is_configured": true, 00:15:56.566 "data_offset": 2048, 00:15:56.566 "data_size": 63488 00:15:56.566 }, 00:15:56.566 { 00:15:56.566 "name": "BaseBdev3", 00:15:56.566 "uuid": "23f9d00f-c579-517c-94a7-400a6cad07ff", 00:15:56.566 "is_configured": true, 00:15:56.566 "data_offset": 2048, 00:15:56.566 "data_size": 63488 00:15:56.566 } 00:15:56.566 ] 00:15:56.566 }' 00:15:56.566 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:56.566 10:11:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.135 10:11:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:57.395 [2024-06-10 10:11:19.002378] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:57.395 [2024-06-10 10:11:19.002405] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:57.395 [2024-06-10 10:11:19.004977] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:57.395 [2024-06-10 10:11:19.004998] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:57.395 [2024-06-10 10:11:19.005056] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:57.395 [2024-06-10 10:11:19.005067] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe97160 name raid_bdev1, state offline 00:15:57.395 0 00:15:57.395 10:11:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1020170 00:15:57.395 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1020170 ']' 00:15:57.395 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1020170 00:15:57.395 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:15:57.395 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:15:57.395 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1020170 00:15:57.395 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:15:57.395 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:15:57.395 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1020170' 00:15:57.395 killing process with pid 1020170 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1020170 00:15:57.396 [2024-06-10 10:11:19.068352] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1020170 00:15:57.396 [2024-06-10 10:11:19.079591] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job 
/raidtest/tmp.0tDvOq7S9T 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:15:57.396 00:15:57.396 real 0m5.722s 00:15:57.396 user 0m9.110s 00:15:57.396 sys 0m0.765s 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:57.396 10:11:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.396 ************************************ 00:15:57.396 END TEST raid_write_error_test 00:15:57.396 ************************************ 00:15:57.396 10:11:19 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:15:57.396 10:11:19 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:57.396 10:11:19 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:15:57.396 10:11:19 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:15:57.396 10:11:19 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:57.396 10:11:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:57.656 ************************************ 00:15:57.656 START TEST raid_state_function_test 00:15:57.656 ************************************ 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 4 false 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1021248 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1021248' 00:15:57.656 Process raid pid: 1021248 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1021248 /var/tmp/spdk-raid.sock 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1021248 ']' 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:57.656 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:15:57.656 10:11:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.656 [2024-06-10 10:11:19.348014] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
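The raid_write_error_test trace above reduces to a short RPC sequence. What follows is a condensed sketch reconstructed from the commands visible in the log, not part of the test output itself; paths are abbreviated, and the loop over three base bdevs is inferred from the repeated BaseBdev1..3 steps. The bdevperf application is started beforehand against the same socket (-r /var/tmp/spdk-raid.sock ... -L bdev_raid), as shown at the top of the test.

RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Each base bdev is built as malloc -> error wrapper -> passthru, so that I/O
# errors can later be injected on the EE_* error bdev sitting underneath the raid member.
for i in 1 2 3; do
    $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
    $RPC bdev_error_create BaseBdev${i}_malloc                        # exposes EE_BaseBdev${i}_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
done
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
$RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure        # fail writes on slot 0
examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
$RPC bdev_raid_delete raid_bdev1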
00:15:57.656 [2024-06-10 10:11:19.348055] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:57.656 [2024-06-10 10:11:19.432425] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:57.656 [2024-06-10 10:11:19.494269] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:15:57.915 [2024-06-10 10:11:19.548712] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:57.915 [2024-06-10 10:11:19.548732] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:58.485 [2024-06-10 10:11:20.327936] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:58.485 [2024-06-10 10:11:20.327966] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:58.485 [2024-06-10 10:11:20.327972] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:58.485 [2024-06-10 10:11:20.327977] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:58.485 [2024-06-10 10:11:20.327984] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:58.485 [2024-06-10 10:11:20.327989] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:58.485 [2024-06-10 10:11:20.327994] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:58.485 [2024-06-10 10:11:20.327999] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.485 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:58.745 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.745 "name": "Existed_Raid", 00:15:58.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.745 "strip_size_kb": 64, 00:15:58.745 "state": "configuring", 00:15:58.745 "raid_level": "raid0", 00:15:58.745 "superblock": false, 00:15:58.745 "num_base_bdevs": 4, 00:15:58.745 "num_base_bdevs_discovered": 0, 00:15:58.745 "num_base_bdevs_operational": 4, 00:15:58.745 "base_bdevs_list": [ 00:15:58.745 { 00:15:58.745 "name": "BaseBdev1", 00:15:58.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.745 "is_configured": false, 00:15:58.745 "data_offset": 0, 00:15:58.745 "data_size": 0 00:15:58.745 }, 00:15:58.745 { 00:15:58.745 "name": "BaseBdev2", 00:15:58.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.745 "is_configured": false, 00:15:58.745 "data_offset": 0, 00:15:58.745 "data_size": 0 00:15:58.745 }, 00:15:58.745 { 00:15:58.745 "name": "BaseBdev3", 00:15:58.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.745 "is_configured": false, 00:15:58.745 "data_offset": 0, 00:15:58.745 "data_size": 0 00:15:58.745 }, 00:15:58.745 { 00:15:58.745 "name": "BaseBdev4", 00:15:58.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.745 "is_configured": false, 00:15:58.745 "data_offset": 0, 00:15:58.745 "data_size": 0 00:15:58.745 } 00:15:58.745 ] 00:15:58.745 }' 00:15:58.745 10:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.745 10:11:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:59.315 10:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:59.575 [2024-06-10 10:11:21.230110] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:59.575 [2024-06-10 10:11:21.230126] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23fdb20 name Existed_Raid, state configuring 00:15:59.575 10:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:59.575 [2024-06-10 10:11:21.422610] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:59.575 [2024-06-10 10:11:21.422627] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:59.575 [2024-06-10 10:11:21.422632] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:59.575 [2024-06-10 10:11:21.422638] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:59.575 [2024-06-10 10:11:21.422642] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:59.575 [2024-06-10 10:11:21.422648] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:59.575 [2024-06-10 10:11:21.422653] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:59.575 [2024-06-10 10:11:21.422658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: 
*DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:59.575 10:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:59.835 [2024-06-10 10:11:21.621716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:59.835 BaseBdev1 00:15:59.835 10:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:59.835 10:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:15:59.835 10:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:59.835 10:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:59.835 10:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:59.835 10:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:59.835 10:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:00.095 10:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:00.356 [ 00:16:00.356 { 00:16:00.356 "name": "BaseBdev1", 00:16:00.356 "aliases": [ 00:16:00.356 "452b5331-8f83-4175-900a-78688dedaa09" 00:16:00.356 ], 00:16:00.356 "product_name": "Malloc disk", 00:16:00.356 "block_size": 512, 00:16:00.356 "num_blocks": 65536, 00:16:00.356 "uuid": "452b5331-8f83-4175-900a-78688dedaa09", 00:16:00.356 "assigned_rate_limits": { 00:16:00.356 "rw_ios_per_sec": 0, 00:16:00.356 "rw_mbytes_per_sec": 0, 00:16:00.356 "r_mbytes_per_sec": 0, 00:16:00.356 "w_mbytes_per_sec": 0 00:16:00.356 }, 00:16:00.356 "claimed": true, 00:16:00.356 "claim_type": "exclusive_write", 00:16:00.356 "zoned": false, 00:16:00.356 "supported_io_types": { 00:16:00.356 "read": true, 00:16:00.356 "write": true, 00:16:00.356 "unmap": true, 00:16:00.356 "write_zeroes": true, 00:16:00.356 "flush": true, 00:16:00.356 "reset": true, 00:16:00.356 "compare": false, 00:16:00.356 "compare_and_write": false, 00:16:00.356 "abort": true, 00:16:00.356 "nvme_admin": false, 00:16:00.356 "nvme_io": false 00:16:00.356 }, 00:16:00.356 "memory_domains": [ 00:16:00.356 { 00:16:00.356 "dma_device_id": "system", 00:16:00.356 "dma_device_type": 1 00:16:00.356 }, 00:16:00.356 { 00:16:00.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.356 "dma_device_type": 2 00:16:00.356 } 00:16:00.356 ], 00:16:00.356 "driver_specific": {} 00:16:00.356 } 00:16:00.356 ] 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=64 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:00.356 "name": "Existed_Raid", 00:16:00.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.356 "strip_size_kb": 64, 00:16:00.356 "state": "configuring", 00:16:00.356 "raid_level": "raid0", 00:16:00.356 "superblock": false, 00:16:00.356 "num_base_bdevs": 4, 00:16:00.356 "num_base_bdevs_discovered": 1, 00:16:00.356 "num_base_bdevs_operational": 4, 00:16:00.356 "base_bdevs_list": [ 00:16:00.356 { 00:16:00.356 "name": "BaseBdev1", 00:16:00.356 "uuid": "452b5331-8f83-4175-900a-78688dedaa09", 00:16:00.356 "is_configured": true, 00:16:00.356 "data_offset": 0, 00:16:00.356 "data_size": 65536 00:16:00.356 }, 00:16:00.356 { 00:16:00.356 "name": "BaseBdev2", 00:16:00.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.356 "is_configured": false, 00:16:00.356 "data_offset": 0, 00:16:00.356 "data_size": 0 00:16:00.356 }, 00:16:00.356 { 00:16:00.356 "name": "BaseBdev3", 00:16:00.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.356 "is_configured": false, 00:16:00.356 "data_offset": 0, 00:16:00.356 "data_size": 0 00:16:00.356 }, 00:16:00.356 { 00:16:00.356 "name": "BaseBdev4", 00:16:00.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.356 "is_configured": false, 00:16:00.356 "data_offset": 0, 00:16:00.356 "data_size": 0 00:16:00.356 } 00:16:00.356 ] 00:16:00.356 }' 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:00.356 10:11:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.924 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:01.184 [2024-06-10 10:11:22.864851] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:01.184 [2024-06-10 10:11:22.864876] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23fd3b0 name Existed_Raid, state configuring 00:16:01.184 10:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:01.444 [2024-06-10 10:11:23.053360] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:01.444 [2024-06-10 10:11:23.054506] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:01.444 
[2024-06-10 10:11:23.054530] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:01.444 [2024-06-10 10:11:23.054536] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:01.444 [2024-06-10 10:11:23.054541] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:01.444 [2024-06-10 10:11:23.054546] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:01.444 [2024-06-10 10:11:23.054552] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:01.445 "name": "Existed_Raid", 00:16:01.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.445 "strip_size_kb": 64, 00:16:01.445 "state": "configuring", 00:16:01.445 "raid_level": "raid0", 00:16:01.445 "superblock": false, 00:16:01.445 "num_base_bdevs": 4, 00:16:01.445 "num_base_bdevs_discovered": 1, 00:16:01.445 "num_base_bdevs_operational": 4, 00:16:01.445 "base_bdevs_list": [ 00:16:01.445 { 00:16:01.445 "name": "BaseBdev1", 00:16:01.445 "uuid": "452b5331-8f83-4175-900a-78688dedaa09", 00:16:01.445 "is_configured": true, 00:16:01.445 "data_offset": 0, 00:16:01.445 "data_size": 65536 00:16:01.445 }, 00:16:01.445 { 00:16:01.445 "name": "BaseBdev2", 00:16:01.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.445 "is_configured": false, 00:16:01.445 "data_offset": 0, 00:16:01.445 "data_size": 0 00:16:01.445 }, 00:16:01.445 { 00:16:01.445 "name": "BaseBdev3", 00:16:01.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.445 "is_configured": false, 00:16:01.445 "data_offset": 0, 00:16:01.445 "data_size": 0 00:16:01.445 }, 00:16:01.445 { 00:16:01.445 "name": 
"BaseBdev4", 00:16:01.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.445 "is_configured": false, 00:16:01.445 "data_offset": 0, 00:16:01.445 "data_size": 0 00:16:01.445 } 00:16:01.445 ] 00:16:01.445 }' 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:01.445 10:11:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.013 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:02.273 [2024-06-10 10:11:23.972558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:02.273 BaseBdev2 00:16:02.273 10:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:02.273 10:11:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:16:02.273 10:11:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:02.273 10:11:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:02.273 10:11:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:02.273 10:11:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:02.273 10:11:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:02.532 10:11:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:02.532 [ 00:16:02.532 { 00:16:02.533 "name": "BaseBdev2", 00:16:02.533 "aliases": [ 00:16:02.533 "13c41eed-1c9b-45d1-ac63-1208f34a444f" 00:16:02.533 ], 00:16:02.533 "product_name": "Malloc disk", 00:16:02.533 "block_size": 512, 00:16:02.533 "num_blocks": 65536, 00:16:02.533 "uuid": "13c41eed-1c9b-45d1-ac63-1208f34a444f", 00:16:02.533 "assigned_rate_limits": { 00:16:02.533 "rw_ios_per_sec": 0, 00:16:02.533 "rw_mbytes_per_sec": 0, 00:16:02.533 "r_mbytes_per_sec": 0, 00:16:02.533 "w_mbytes_per_sec": 0 00:16:02.533 }, 00:16:02.533 "claimed": true, 00:16:02.533 "claim_type": "exclusive_write", 00:16:02.533 "zoned": false, 00:16:02.533 "supported_io_types": { 00:16:02.533 "read": true, 00:16:02.533 "write": true, 00:16:02.533 "unmap": true, 00:16:02.533 "write_zeroes": true, 00:16:02.533 "flush": true, 00:16:02.533 "reset": true, 00:16:02.533 "compare": false, 00:16:02.533 "compare_and_write": false, 00:16:02.533 "abort": true, 00:16:02.533 "nvme_admin": false, 00:16:02.533 "nvme_io": false 00:16:02.533 }, 00:16:02.533 "memory_domains": [ 00:16:02.533 { 00:16:02.533 "dma_device_id": "system", 00:16:02.533 "dma_device_type": 1 00:16:02.533 }, 00:16:02.533 { 00:16:02.533 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.533 "dma_device_type": 2 00:16:02.533 } 00:16:02.533 ], 00:16:02.533 "driver_specific": {} 00:16:02.533 } 00:16:02.533 ] 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 
00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.533 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.792 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.792 "name": "Existed_Raid", 00:16:02.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.792 "strip_size_kb": 64, 00:16:02.792 "state": "configuring", 00:16:02.792 "raid_level": "raid0", 00:16:02.792 "superblock": false, 00:16:02.792 "num_base_bdevs": 4, 00:16:02.792 "num_base_bdevs_discovered": 2, 00:16:02.792 "num_base_bdevs_operational": 4, 00:16:02.792 "base_bdevs_list": [ 00:16:02.792 { 00:16:02.792 "name": "BaseBdev1", 00:16:02.792 "uuid": "452b5331-8f83-4175-900a-78688dedaa09", 00:16:02.792 "is_configured": true, 00:16:02.792 "data_offset": 0, 00:16:02.792 "data_size": 65536 00:16:02.792 }, 00:16:02.792 { 00:16:02.792 "name": "BaseBdev2", 00:16:02.792 "uuid": "13c41eed-1c9b-45d1-ac63-1208f34a444f", 00:16:02.792 "is_configured": true, 00:16:02.792 "data_offset": 0, 00:16:02.792 "data_size": 65536 00:16:02.792 }, 00:16:02.792 { 00:16:02.792 "name": "BaseBdev3", 00:16:02.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.792 "is_configured": false, 00:16:02.792 "data_offset": 0, 00:16:02.792 "data_size": 0 00:16:02.792 }, 00:16:02.792 { 00:16:02.792 "name": "BaseBdev4", 00:16:02.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.792 "is_configured": false, 00:16:02.792 "data_offset": 0, 00:16:02.792 "data_size": 0 00:16:02.792 } 00:16:02.792 ] 00:16:02.792 }' 00:16:02.792 10:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.792 10:11:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:03.361 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:03.621 [2024-06-10 10:11:25.244629] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:03.621 BaseBdev3 00:16:03.621 10:11:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:03.621 10:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:16:03.621 10:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:03.621 10:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:03.621 10:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:03.621 10:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:03.621 10:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:03.621 10:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:03.882 [ 00:16:03.882 { 00:16:03.882 "name": "BaseBdev3", 00:16:03.882 "aliases": [ 00:16:03.882 "ba68dfd4-d822-448d-ace4-2f967b4a9621" 00:16:03.882 ], 00:16:03.882 "product_name": "Malloc disk", 00:16:03.882 "block_size": 512, 00:16:03.882 "num_blocks": 65536, 00:16:03.882 "uuid": "ba68dfd4-d822-448d-ace4-2f967b4a9621", 00:16:03.882 "assigned_rate_limits": { 00:16:03.882 "rw_ios_per_sec": 0, 00:16:03.882 "rw_mbytes_per_sec": 0, 00:16:03.882 "r_mbytes_per_sec": 0, 00:16:03.882 "w_mbytes_per_sec": 0 00:16:03.882 }, 00:16:03.882 "claimed": true, 00:16:03.882 "claim_type": "exclusive_write", 00:16:03.882 "zoned": false, 00:16:03.882 "supported_io_types": { 00:16:03.882 "read": true, 00:16:03.882 "write": true, 00:16:03.882 "unmap": true, 00:16:03.882 "write_zeroes": true, 00:16:03.882 "flush": true, 00:16:03.882 "reset": true, 00:16:03.882 "compare": false, 00:16:03.882 "compare_and_write": false, 00:16:03.882 "abort": true, 00:16:03.882 "nvme_admin": false, 00:16:03.882 "nvme_io": false 00:16:03.882 }, 00:16:03.882 "memory_domains": [ 00:16:03.882 { 00:16:03.882 "dma_device_id": "system", 00:16:03.882 "dma_device_type": 1 00:16:03.882 }, 00:16:03.882 { 00:16:03.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:03.882 "dma_device_type": 2 00:16:03.882 } 00:16:03.882 ], 00:16:03.882 "driver_specific": {} 00:16:03.882 } 00:16:03.882 ] 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
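raid_state_function_test assembles Existed_Raid incrementally: the raid0 array is created while its base bdevs are still missing, and each time a malloc bdev named BaseBdevN appears it is claimed, until all four operational members are discovered and the array leaves the configuring state. A sketch of that sequence, using only commands that appear in the trace (the per-step state query is added purely for illustration):

RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
for i in 1 2 3 4; do
    $RPC bdev_malloc_create 32 512 -b BaseBdev$i        # claimed by Existed_Raid as soon as it appears
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
done
# Prints "configuring" after BaseBdev1..3 and "online" once BaseBdev4 is claimed,
# which matches the transition the trace shows next.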
00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.882 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.142 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.142 "name": "Existed_Raid", 00:16:04.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.142 "strip_size_kb": 64, 00:16:04.142 "state": "configuring", 00:16:04.142 "raid_level": "raid0", 00:16:04.142 "superblock": false, 00:16:04.142 "num_base_bdevs": 4, 00:16:04.142 "num_base_bdevs_discovered": 3, 00:16:04.142 "num_base_bdevs_operational": 4, 00:16:04.142 "base_bdevs_list": [ 00:16:04.142 { 00:16:04.142 "name": "BaseBdev1", 00:16:04.142 "uuid": "452b5331-8f83-4175-900a-78688dedaa09", 00:16:04.142 "is_configured": true, 00:16:04.142 "data_offset": 0, 00:16:04.142 "data_size": 65536 00:16:04.142 }, 00:16:04.142 { 00:16:04.142 "name": "BaseBdev2", 00:16:04.142 "uuid": "13c41eed-1c9b-45d1-ac63-1208f34a444f", 00:16:04.142 "is_configured": true, 00:16:04.142 "data_offset": 0, 00:16:04.142 "data_size": 65536 00:16:04.142 }, 00:16:04.142 { 00:16:04.142 "name": "BaseBdev3", 00:16:04.142 "uuid": "ba68dfd4-d822-448d-ace4-2f967b4a9621", 00:16:04.142 "is_configured": true, 00:16:04.142 "data_offset": 0, 00:16:04.142 "data_size": 65536 00:16:04.142 }, 00:16:04.142 { 00:16:04.142 "name": "BaseBdev4", 00:16:04.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.142 "is_configured": false, 00:16:04.142 "data_offset": 0, 00:16:04.142 "data_size": 0 00:16:04.142 } 00:16:04.142 ] 00:16:04.142 }' 00:16:04.142 10:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.142 10:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.712 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:04.712 [2024-06-10 10:11:26.568970] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:04.712 [2024-06-10 10:11:26.568992] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23fe4c0 00:16:04.712 [2024-06-10 10:11:26.568997] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:04.712 [2024-06-10 10:11:26.569143] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23f68a0 00:16:04.712 [2024-06-10 10:11:26.569236] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23fe4c0 00:16:04.712 [2024-06-10 10:11:26.569242] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23fe4c0 00:16:04.712 [2024-06-10 10:11:26.569359] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:04.712 BaseBdev4 00:16:04.972 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:04.972 10:11:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:16:04.972 10:11:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:04.972 10:11:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:04.972 10:11:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:04.972 10:11:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:04.972 10:11:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:04.972 10:11:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:05.232 [ 00:16:05.232 { 00:16:05.232 "name": "BaseBdev4", 00:16:05.232 "aliases": [ 00:16:05.232 "feaef65f-12b1-4450-942f-53778806f505" 00:16:05.232 ], 00:16:05.232 "product_name": "Malloc disk", 00:16:05.232 "block_size": 512, 00:16:05.232 "num_blocks": 65536, 00:16:05.232 "uuid": "feaef65f-12b1-4450-942f-53778806f505", 00:16:05.232 "assigned_rate_limits": { 00:16:05.232 "rw_ios_per_sec": 0, 00:16:05.232 "rw_mbytes_per_sec": 0, 00:16:05.232 "r_mbytes_per_sec": 0, 00:16:05.232 "w_mbytes_per_sec": 0 00:16:05.232 }, 00:16:05.232 "claimed": true, 00:16:05.232 "claim_type": "exclusive_write", 00:16:05.232 "zoned": false, 00:16:05.232 "supported_io_types": { 00:16:05.232 "read": true, 00:16:05.232 "write": true, 00:16:05.232 "unmap": true, 00:16:05.232 "write_zeroes": true, 00:16:05.232 "flush": true, 00:16:05.232 "reset": true, 00:16:05.232 "compare": false, 00:16:05.232 "compare_and_write": false, 00:16:05.232 "abort": true, 00:16:05.232 "nvme_admin": false, 00:16:05.232 "nvme_io": false 00:16:05.232 }, 00:16:05.232 "memory_domains": [ 00:16:05.232 { 00:16:05.232 "dma_device_id": "system", 00:16:05.232 "dma_device_type": 1 00:16:05.232 }, 00:16:05.232 { 00:16:05.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.232 "dma_device_type": 2 00:16:05.232 } 00:16:05.232 ], 00:16:05.232 "driver_specific": {} 00:16:05.232 } 00:16:05.232 ] 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.232 10:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:05.493 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.493 "name": "Existed_Raid", 00:16:05.493 "uuid": "e9425062-31d4-4d80-887b-b9a23f76cc08", 00:16:05.493 "strip_size_kb": 64, 00:16:05.493 "state": "online", 00:16:05.493 "raid_level": "raid0", 00:16:05.493 "superblock": false, 00:16:05.493 "num_base_bdevs": 4, 00:16:05.493 "num_base_bdevs_discovered": 4, 00:16:05.493 "num_base_bdevs_operational": 4, 00:16:05.493 "base_bdevs_list": [ 00:16:05.493 { 00:16:05.493 "name": "BaseBdev1", 00:16:05.493 "uuid": "452b5331-8f83-4175-900a-78688dedaa09", 00:16:05.493 "is_configured": true, 00:16:05.493 "data_offset": 0, 00:16:05.493 "data_size": 65536 00:16:05.493 }, 00:16:05.493 { 00:16:05.493 "name": "BaseBdev2", 00:16:05.493 "uuid": "13c41eed-1c9b-45d1-ac63-1208f34a444f", 00:16:05.493 "is_configured": true, 00:16:05.493 "data_offset": 0, 00:16:05.493 "data_size": 65536 00:16:05.493 }, 00:16:05.493 { 00:16:05.493 "name": "BaseBdev3", 00:16:05.493 "uuid": "ba68dfd4-d822-448d-ace4-2f967b4a9621", 00:16:05.493 "is_configured": true, 00:16:05.493 "data_offset": 0, 00:16:05.493 "data_size": 65536 00:16:05.493 }, 00:16:05.493 { 00:16:05.493 "name": "BaseBdev4", 00:16:05.493 "uuid": "feaef65f-12b1-4450-942f-53778806f505", 00:16:05.493 "is_configured": true, 00:16:05.493 "data_offset": 0, 00:16:05.493 "data_size": 65536 00:16:05.493 } 00:16:05.493 ] 00:16:05.493 }' 00:16:05.493 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.493 10:11:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:06.064 [2024-06-10 10:11:27.836410] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:06.064 "name": "Existed_Raid", 00:16:06.064 "aliases": [ 00:16:06.064 "e9425062-31d4-4d80-887b-b9a23f76cc08" 00:16:06.064 ], 00:16:06.064 
"product_name": "Raid Volume", 00:16:06.064 "block_size": 512, 00:16:06.064 "num_blocks": 262144, 00:16:06.064 "uuid": "e9425062-31d4-4d80-887b-b9a23f76cc08", 00:16:06.064 "assigned_rate_limits": { 00:16:06.064 "rw_ios_per_sec": 0, 00:16:06.064 "rw_mbytes_per_sec": 0, 00:16:06.064 "r_mbytes_per_sec": 0, 00:16:06.064 "w_mbytes_per_sec": 0 00:16:06.064 }, 00:16:06.064 "claimed": false, 00:16:06.064 "zoned": false, 00:16:06.064 "supported_io_types": { 00:16:06.064 "read": true, 00:16:06.064 "write": true, 00:16:06.064 "unmap": true, 00:16:06.064 "write_zeroes": true, 00:16:06.064 "flush": true, 00:16:06.064 "reset": true, 00:16:06.064 "compare": false, 00:16:06.064 "compare_and_write": false, 00:16:06.064 "abort": false, 00:16:06.064 "nvme_admin": false, 00:16:06.064 "nvme_io": false 00:16:06.064 }, 00:16:06.064 "memory_domains": [ 00:16:06.064 { 00:16:06.064 "dma_device_id": "system", 00:16:06.064 "dma_device_type": 1 00:16:06.064 }, 00:16:06.064 { 00:16:06.064 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.064 "dma_device_type": 2 00:16:06.064 }, 00:16:06.064 { 00:16:06.064 "dma_device_id": "system", 00:16:06.064 "dma_device_type": 1 00:16:06.064 }, 00:16:06.064 { 00:16:06.064 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.064 "dma_device_type": 2 00:16:06.064 }, 00:16:06.064 { 00:16:06.064 "dma_device_id": "system", 00:16:06.064 "dma_device_type": 1 00:16:06.064 }, 00:16:06.064 { 00:16:06.064 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.064 "dma_device_type": 2 00:16:06.064 }, 00:16:06.064 { 00:16:06.064 "dma_device_id": "system", 00:16:06.064 "dma_device_type": 1 00:16:06.064 }, 00:16:06.064 { 00:16:06.064 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.064 "dma_device_type": 2 00:16:06.064 } 00:16:06.064 ], 00:16:06.064 "driver_specific": { 00:16:06.064 "raid": { 00:16:06.064 "uuid": "e9425062-31d4-4d80-887b-b9a23f76cc08", 00:16:06.064 "strip_size_kb": 64, 00:16:06.064 "state": "online", 00:16:06.064 "raid_level": "raid0", 00:16:06.064 "superblock": false, 00:16:06.064 "num_base_bdevs": 4, 00:16:06.064 "num_base_bdevs_discovered": 4, 00:16:06.064 "num_base_bdevs_operational": 4, 00:16:06.064 "base_bdevs_list": [ 00:16:06.064 { 00:16:06.064 "name": "BaseBdev1", 00:16:06.064 "uuid": "452b5331-8f83-4175-900a-78688dedaa09", 00:16:06.064 "is_configured": true, 00:16:06.064 "data_offset": 0, 00:16:06.064 "data_size": 65536 00:16:06.064 }, 00:16:06.064 { 00:16:06.064 "name": "BaseBdev2", 00:16:06.064 "uuid": "13c41eed-1c9b-45d1-ac63-1208f34a444f", 00:16:06.064 "is_configured": true, 00:16:06.064 "data_offset": 0, 00:16:06.064 "data_size": 65536 00:16:06.064 }, 00:16:06.064 { 00:16:06.064 "name": "BaseBdev3", 00:16:06.064 "uuid": "ba68dfd4-d822-448d-ace4-2f967b4a9621", 00:16:06.064 "is_configured": true, 00:16:06.064 "data_offset": 0, 00:16:06.064 "data_size": 65536 00:16:06.064 }, 00:16:06.064 { 00:16:06.064 "name": "BaseBdev4", 00:16:06.064 "uuid": "feaef65f-12b1-4450-942f-53778806f505", 00:16:06.064 "is_configured": true, 00:16:06.064 "data_offset": 0, 00:16:06.064 "data_size": 65536 00:16:06.064 } 00:16:06.064 ] 00:16:06.064 } 00:16:06.064 } 00:16:06.064 }' 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:06.064 BaseBdev2 00:16:06.064 BaseBdev3 00:16:06.064 BaseBdev4' 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:06.064 10:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.325 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.325 "name": "BaseBdev1", 00:16:06.325 "aliases": [ 00:16:06.325 "452b5331-8f83-4175-900a-78688dedaa09" 00:16:06.325 ], 00:16:06.325 "product_name": "Malloc disk", 00:16:06.325 "block_size": 512, 00:16:06.325 "num_blocks": 65536, 00:16:06.325 "uuid": "452b5331-8f83-4175-900a-78688dedaa09", 00:16:06.325 "assigned_rate_limits": { 00:16:06.325 "rw_ios_per_sec": 0, 00:16:06.325 "rw_mbytes_per_sec": 0, 00:16:06.325 "r_mbytes_per_sec": 0, 00:16:06.325 "w_mbytes_per_sec": 0 00:16:06.325 }, 00:16:06.325 "claimed": true, 00:16:06.325 "claim_type": "exclusive_write", 00:16:06.325 "zoned": false, 00:16:06.325 "supported_io_types": { 00:16:06.325 "read": true, 00:16:06.325 "write": true, 00:16:06.325 "unmap": true, 00:16:06.325 "write_zeroes": true, 00:16:06.325 "flush": true, 00:16:06.325 "reset": true, 00:16:06.325 "compare": false, 00:16:06.325 "compare_and_write": false, 00:16:06.325 "abort": true, 00:16:06.325 "nvme_admin": false, 00:16:06.325 "nvme_io": false 00:16:06.325 }, 00:16:06.325 "memory_domains": [ 00:16:06.325 { 00:16:06.325 "dma_device_id": "system", 00:16:06.325 "dma_device_type": 1 00:16:06.325 }, 00:16:06.325 { 00:16:06.325 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.325 "dma_device_type": 2 00:16:06.325 } 00:16:06.325 ], 00:16:06.325 "driver_specific": {} 00:16:06.325 }' 00:16:06.325 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.325 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.325 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.325 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.585 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.585 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:06.585 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.585 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.585 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:06.585 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.585 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.585 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:06.585 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.585 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:06.585 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.846 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.846 "name": 
"BaseBdev2", 00:16:06.846 "aliases": [ 00:16:06.846 "13c41eed-1c9b-45d1-ac63-1208f34a444f" 00:16:06.846 ], 00:16:06.846 "product_name": "Malloc disk", 00:16:06.846 "block_size": 512, 00:16:06.846 "num_blocks": 65536, 00:16:06.846 "uuid": "13c41eed-1c9b-45d1-ac63-1208f34a444f", 00:16:06.846 "assigned_rate_limits": { 00:16:06.846 "rw_ios_per_sec": 0, 00:16:06.846 "rw_mbytes_per_sec": 0, 00:16:06.846 "r_mbytes_per_sec": 0, 00:16:06.846 "w_mbytes_per_sec": 0 00:16:06.846 }, 00:16:06.846 "claimed": true, 00:16:06.846 "claim_type": "exclusive_write", 00:16:06.846 "zoned": false, 00:16:06.846 "supported_io_types": { 00:16:06.846 "read": true, 00:16:06.846 "write": true, 00:16:06.846 "unmap": true, 00:16:06.846 "write_zeroes": true, 00:16:06.846 "flush": true, 00:16:06.846 "reset": true, 00:16:06.846 "compare": false, 00:16:06.846 "compare_and_write": false, 00:16:06.846 "abort": true, 00:16:06.846 "nvme_admin": false, 00:16:06.846 "nvme_io": false 00:16:06.846 }, 00:16:06.846 "memory_domains": [ 00:16:06.846 { 00:16:06.846 "dma_device_id": "system", 00:16:06.846 "dma_device_type": 1 00:16:06.846 }, 00:16:06.846 { 00:16:06.846 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.846 "dma_device_type": 2 00:16:06.846 } 00:16:06.846 ], 00:16:06.846 "driver_specific": {} 00:16:06.846 }' 00:16:06.846 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.846 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.846 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.846 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.106 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.106 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:07.106 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.106 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.106 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.106 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.106 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.106 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.106 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:07.106 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:07.106 10:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:07.366 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:07.366 "name": "BaseBdev3", 00:16:07.366 "aliases": [ 00:16:07.366 "ba68dfd4-d822-448d-ace4-2f967b4a9621" 00:16:07.366 ], 00:16:07.366 "product_name": "Malloc disk", 00:16:07.366 "block_size": 512, 00:16:07.366 "num_blocks": 65536, 00:16:07.366 "uuid": "ba68dfd4-d822-448d-ace4-2f967b4a9621", 00:16:07.366 "assigned_rate_limits": { 00:16:07.366 "rw_ios_per_sec": 0, 00:16:07.366 "rw_mbytes_per_sec": 0, 00:16:07.366 "r_mbytes_per_sec": 0, 00:16:07.366 "w_mbytes_per_sec": 0 00:16:07.366 }, 
00:16:07.366 "claimed": true, 00:16:07.366 "claim_type": "exclusive_write", 00:16:07.366 "zoned": false, 00:16:07.366 "supported_io_types": { 00:16:07.366 "read": true, 00:16:07.366 "write": true, 00:16:07.366 "unmap": true, 00:16:07.366 "write_zeroes": true, 00:16:07.366 "flush": true, 00:16:07.366 "reset": true, 00:16:07.366 "compare": false, 00:16:07.366 "compare_and_write": false, 00:16:07.367 "abort": true, 00:16:07.367 "nvme_admin": false, 00:16:07.367 "nvme_io": false 00:16:07.367 }, 00:16:07.367 "memory_domains": [ 00:16:07.367 { 00:16:07.367 "dma_device_id": "system", 00:16:07.367 "dma_device_type": 1 00:16:07.367 }, 00:16:07.367 { 00:16:07.367 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.367 "dma_device_type": 2 00:16:07.367 } 00:16:07.367 ], 00:16:07.367 "driver_specific": {} 00:16:07.367 }' 00:16:07.367 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.367 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.367 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:07.367 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.367 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.627 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:07.627 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.627 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.627 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.627 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.627 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.627 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.627 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:07.627 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:07.627 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:07.887 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:07.887 "name": "BaseBdev4", 00:16:07.887 "aliases": [ 00:16:07.887 "feaef65f-12b1-4450-942f-53778806f505" 00:16:07.887 ], 00:16:07.887 "product_name": "Malloc disk", 00:16:07.887 "block_size": 512, 00:16:07.887 "num_blocks": 65536, 00:16:07.887 "uuid": "feaef65f-12b1-4450-942f-53778806f505", 00:16:07.887 "assigned_rate_limits": { 00:16:07.887 "rw_ios_per_sec": 0, 00:16:07.887 "rw_mbytes_per_sec": 0, 00:16:07.887 "r_mbytes_per_sec": 0, 00:16:07.887 "w_mbytes_per_sec": 0 00:16:07.887 }, 00:16:07.887 "claimed": true, 00:16:07.887 "claim_type": "exclusive_write", 00:16:07.887 "zoned": false, 00:16:07.887 "supported_io_types": { 00:16:07.887 "read": true, 00:16:07.887 "write": true, 00:16:07.887 "unmap": true, 00:16:07.887 "write_zeroes": true, 00:16:07.887 "flush": true, 00:16:07.887 "reset": true, 00:16:07.887 "compare": false, 00:16:07.887 "compare_and_write": false, 00:16:07.887 "abort": true, 00:16:07.887 "nvme_admin": false, 00:16:07.887 "nvme_io": false 
00:16:07.887 }, 00:16:07.887 "memory_domains": [ 00:16:07.887 { 00:16:07.887 "dma_device_id": "system", 00:16:07.887 "dma_device_type": 1 00:16:07.887 }, 00:16:07.887 { 00:16:07.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.887 "dma_device_type": 2 00:16:07.887 } 00:16:07.887 ], 00:16:07.887 "driver_specific": {} 00:16:07.887 }' 00:16:07.887 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.887 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.887 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:07.887 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.148 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.148 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:08.148 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.148 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.148 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:08.148 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.148 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.148 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:08.148 10:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:08.409 [2024-06-10 10:11:30.138237] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:08.409 [2024-06-10 10:11:30.138258] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:08.409 [2024-06-10 10:11:30.138299] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.409 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.670 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.670 "name": "Existed_Raid", 00:16:08.670 "uuid": "e9425062-31d4-4d80-887b-b9a23f76cc08", 00:16:08.670 "strip_size_kb": 64, 00:16:08.670 "state": "offline", 00:16:08.670 "raid_level": "raid0", 00:16:08.670 "superblock": false, 00:16:08.670 "num_base_bdevs": 4, 00:16:08.670 "num_base_bdevs_discovered": 3, 00:16:08.670 "num_base_bdevs_operational": 3, 00:16:08.670 "base_bdevs_list": [ 00:16:08.670 { 00:16:08.670 "name": null, 00:16:08.670 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.670 "is_configured": false, 00:16:08.670 "data_offset": 0, 00:16:08.670 "data_size": 65536 00:16:08.670 }, 00:16:08.670 { 00:16:08.670 "name": "BaseBdev2", 00:16:08.670 "uuid": "13c41eed-1c9b-45d1-ac63-1208f34a444f", 00:16:08.670 "is_configured": true, 00:16:08.670 "data_offset": 0, 00:16:08.670 "data_size": 65536 00:16:08.670 }, 00:16:08.670 { 00:16:08.670 "name": "BaseBdev3", 00:16:08.670 "uuid": "ba68dfd4-d822-448d-ace4-2f967b4a9621", 00:16:08.670 "is_configured": true, 00:16:08.670 "data_offset": 0, 00:16:08.670 "data_size": 65536 00:16:08.670 }, 00:16:08.670 { 00:16:08.670 "name": "BaseBdev4", 00:16:08.670 "uuid": "feaef65f-12b1-4450-942f-53778806f505", 00:16:08.670 "is_configured": true, 00:16:08.670 "data_offset": 0, 00:16:08.670 "data_size": 65536 00:16:08.670 } 00:16:08.670 ] 00:16:08.670 }' 00:16:08.670 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.670 10:11:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:09.241 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:09.241 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:09.241 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.241 10:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:09.241 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:09.241 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:09.241 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:09.502 [2024-06-10 10:11:31.237005] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:09.502 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:09.502 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:09.502 10:11:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.502 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:09.763 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:09.763 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:09.763 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:09.763 [2024-06-10 10:11:31.619773] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:10.024 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:10.024 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:10.024 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.024 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:10.024 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:10.024 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:10.024 10:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:10.286 [2024-06-10 10:11:32.006622] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:10.286 [2024-06-10 10:11:32.006649] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23fe4c0 name Existed_Raid, state offline 00:16:10.286 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:10.286 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:10.286 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.286 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:10.545 BaseBdev2 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local 
bdev_name=BaseBdev2 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:10.545 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:10.806 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:11.067 [ 00:16:11.067 { 00:16:11.067 "name": "BaseBdev2", 00:16:11.067 "aliases": [ 00:16:11.067 "9663027b-2664-4a3f-a3e8-c26d49f02365" 00:16:11.067 ], 00:16:11.067 "product_name": "Malloc disk", 00:16:11.067 "block_size": 512, 00:16:11.067 "num_blocks": 65536, 00:16:11.067 "uuid": "9663027b-2664-4a3f-a3e8-c26d49f02365", 00:16:11.067 "assigned_rate_limits": { 00:16:11.067 "rw_ios_per_sec": 0, 00:16:11.067 "rw_mbytes_per_sec": 0, 00:16:11.067 "r_mbytes_per_sec": 0, 00:16:11.067 "w_mbytes_per_sec": 0 00:16:11.067 }, 00:16:11.067 "claimed": false, 00:16:11.067 "zoned": false, 00:16:11.067 "supported_io_types": { 00:16:11.067 "read": true, 00:16:11.067 "write": true, 00:16:11.067 "unmap": true, 00:16:11.067 "write_zeroes": true, 00:16:11.067 "flush": true, 00:16:11.067 "reset": true, 00:16:11.067 "compare": false, 00:16:11.067 "compare_and_write": false, 00:16:11.067 "abort": true, 00:16:11.067 "nvme_admin": false, 00:16:11.067 "nvme_io": false 00:16:11.067 }, 00:16:11.067 "memory_domains": [ 00:16:11.067 { 00:16:11.067 "dma_device_id": "system", 00:16:11.067 "dma_device_type": 1 00:16:11.067 }, 00:16:11.067 { 00:16:11.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.067 "dma_device_type": 2 00:16:11.067 } 00:16:11.067 ], 00:16:11.067 "driver_specific": {} 00:16:11.067 } 00:16:11.067 ] 00:16:11.067 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:11.067 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:11.067 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:11.067 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:11.338 BaseBdev3 00:16:11.338 10:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:11.339 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:16:11.339 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:11.339 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:11.339 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:11.339 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:11.339 10:11:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:11.339 10:11:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:11.608 [ 00:16:11.608 { 00:16:11.608 "name": "BaseBdev3", 00:16:11.608 "aliases": [ 00:16:11.608 "aac91ac1-263e-4766-a68d-b313228a0398" 00:16:11.608 ], 00:16:11.608 "product_name": "Malloc disk", 00:16:11.608 "block_size": 512, 00:16:11.608 "num_blocks": 65536, 00:16:11.608 "uuid": "aac91ac1-263e-4766-a68d-b313228a0398", 00:16:11.608 "assigned_rate_limits": { 00:16:11.608 "rw_ios_per_sec": 0, 00:16:11.608 "rw_mbytes_per_sec": 0, 00:16:11.608 "r_mbytes_per_sec": 0, 00:16:11.608 "w_mbytes_per_sec": 0 00:16:11.608 }, 00:16:11.608 "claimed": false, 00:16:11.608 "zoned": false, 00:16:11.608 "supported_io_types": { 00:16:11.608 "read": true, 00:16:11.608 "write": true, 00:16:11.608 "unmap": true, 00:16:11.608 "write_zeroes": true, 00:16:11.608 "flush": true, 00:16:11.608 "reset": true, 00:16:11.608 "compare": false, 00:16:11.608 "compare_and_write": false, 00:16:11.608 "abort": true, 00:16:11.608 "nvme_admin": false, 00:16:11.608 "nvme_io": false 00:16:11.608 }, 00:16:11.608 "memory_domains": [ 00:16:11.608 { 00:16:11.608 "dma_device_id": "system", 00:16:11.608 "dma_device_type": 1 00:16:11.608 }, 00:16:11.608 { 00:16:11.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.608 "dma_device_type": 2 00:16:11.608 } 00:16:11.608 ], 00:16:11.608 "driver_specific": {} 00:16:11.608 } 00:16:11.608 ] 00:16:11.608 10:11:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:11.608 10:11:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:11.608 10:11:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:11.608 10:11:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:11.869 BaseBdev4 00:16:11.869 10:11:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:16:11.869 10:11:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:16:11.869 10:11:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:11.869 10:11:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:11.869 10:11:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:11.869 10:11:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:11.869 10:11:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:11.869 10:11:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:12.134 [ 00:16:12.134 { 00:16:12.134 "name": "BaseBdev4", 00:16:12.134 "aliases": [ 00:16:12.134 "23dd3227-1cac-4ba5-be78-fead955ae5ce" 00:16:12.134 ], 00:16:12.134 "product_name": "Malloc disk", 00:16:12.134 "block_size": 512, 00:16:12.134 
"num_blocks": 65536, 00:16:12.134 "uuid": "23dd3227-1cac-4ba5-be78-fead955ae5ce", 00:16:12.134 "assigned_rate_limits": { 00:16:12.134 "rw_ios_per_sec": 0, 00:16:12.134 "rw_mbytes_per_sec": 0, 00:16:12.134 "r_mbytes_per_sec": 0, 00:16:12.134 "w_mbytes_per_sec": 0 00:16:12.134 }, 00:16:12.134 "claimed": false, 00:16:12.134 "zoned": false, 00:16:12.134 "supported_io_types": { 00:16:12.134 "read": true, 00:16:12.134 "write": true, 00:16:12.134 "unmap": true, 00:16:12.134 "write_zeroes": true, 00:16:12.134 "flush": true, 00:16:12.134 "reset": true, 00:16:12.134 "compare": false, 00:16:12.134 "compare_and_write": false, 00:16:12.134 "abort": true, 00:16:12.134 "nvme_admin": false, 00:16:12.134 "nvme_io": false 00:16:12.134 }, 00:16:12.134 "memory_domains": [ 00:16:12.134 { 00:16:12.134 "dma_device_id": "system", 00:16:12.134 "dma_device_type": 1 00:16:12.134 }, 00:16:12.134 { 00:16:12.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.134 "dma_device_type": 2 00:16:12.134 } 00:16:12.134 ], 00:16:12.134 "driver_specific": {} 00:16:12.134 } 00:16:12.134 ] 00:16:12.134 10:11:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:12.134 10:11:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:12.134 10:11:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:12.134 10:11:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:12.395 [2024-06-10 10:11:34.053484] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:12.395 [2024-06-10 10:11:34.053512] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:12.395 [2024-06-10 10:11:34.053524] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:12.396 [2024-06-10 10:11:34.054546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:12.396 [2024-06-10 10:11:34.054577] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.396 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.656 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.656 "name": "Existed_Raid", 00:16:12.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.656 "strip_size_kb": 64, 00:16:12.656 "state": "configuring", 00:16:12.656 "raid_level": "raid0", 00:16:12.656 "superblock": false, 00:16:12.656 "num_base_bdevs": 4, 00:16:12.656 "num_base_bdevs_discovered": 3, 00:16:12.656 "num_base_bdevs_operational": 4, 00:16:12.656 "base_bdevs_list": [ 00:16:12.656 { 00:16:12.656 "name": "BaseBdev1", 00:16:12.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.656 "is_configured": false, 00:16:12.656 "data_offset": 0, 00:16:12.656 "data_size": 0 00:16:12.656 }, 00:16:12.656 { 00:16:12.656 "name": "BaseBdev2", 00:16:12.656 "uuid": "9663027b-2664-4a3f-a3e8-c26d49f02365", 00:16:12.656 "is_configured": true, 00:16:12.656 "data_offset": 0, 00:16:12.656 "data_size": 65536 00:16:12.656 }, 00:16:12.656 { 00:16:12.656 "name": "BaseBdev3", 00:16:12.656 "uuid": "aac91ac1-263e-4766-a68d-b313228a0398", 00:16:12.656 "is_configured": true, 00:16:12.656 "data_offset": 0, 00:16:12.656 "data_size": 65536 00:16:12.657 }, 00:16:12.657 { 00:16:12.657 "name": "BaseBdev4", 00:16:12.657 "uuid": "23dd3227-1cac-4ba5-be78-fead955ae5ce", 00:16:12.657 "is_configured": true, 00:16:12.657 "data_offset": 0, 00:16:12.657 "data_size": 65536 00:16:12.657 } 00:16:12.657 ] 00:16:12.657 }' 00:16:12.657 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.657 10:11:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:12.917 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:13.178 [2024-06-10 10:11:34.951726] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:16:13.178 10:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:13.439 10:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.439 "name": "Existed_Raid", 00:16:13.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.439 "strip_size_kb": 64, 00:16:13.439 "state": "configuring", 00:16:13.439 "raid_level": "raid0", 00:16:13.439 "superblock": false, 00:16:13.439 "num_base_bdevs": 4, 00:16:13.439 "num_base_bdevs_discovered": 2, 00:16:13.439 "num_base_bdevs_operational": 4, 00:16:13.439 "base_bdevs_list": [ 00:16:13.439 { 00:16:13.439 "name": "BaseBdev1", 00:16:13.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.439 "is_configured": false, 00:16:13.439 "data_offset": 0, 00:16:13.439 "data_size": 0 00:16:13.439 }, 00:16:13.439 { 00:16:13.439 "name": null, 00:16:13.439 "uuid": "9663027b-2664-4a3f-a3e8-c26d49f02365", 00:16:13.439 "is_configured": false, 00:16:13.439 "data_offset": 0, 00:16:13.439 "data_size": 65536 00:16:13.439 }, 00:16:13.439 { 00:16:13.439 "name": "BaseBdev3", 00:16:13.439 "uuid": "aac91ac1-263e-4766-a68d-b313228a0398", 00:16:13.439 "is_configured": true, 00:16:13.439 "data_offset": 0, 00:16:13.439 "data_size": 65536 00:16:13.439 }, 00:16:13.439 { 00:16:13.439 "name": "BaseBdev4", 00:16:13.439 "uuid": "23dd3227-1cac-4ba5-be78-fead955ae5ce", 00:16:13.439 "is_configured": true, 00:16:13.439 "data_offset": 0, 00:16:13.439 "data_size": 65536 00:16:13.439 } 00:16:13.439 ] 00:16:13.439 }' 00:16:13.439 10:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.440 10:11:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.011 10:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.011 10:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:14.272 10:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:14.272 10:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:14.272 [2024-06-10 10:11:36.103607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:14.272 BaseBdev1 00:16:14.272 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:14.272 10:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:16:14.272 10:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:14.272 10:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:14.272 10:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:14.272 10:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:14.272 10:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:14.533 10:11:36 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:14.794 [ 00:16:14.794 { 00:16:14.794 "name": "BaseBdev1", 00:16:14.794 "aliases": [ 00:16:14.794 "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7" 00:16:14.794 ], 00:16:14.794 "product_name": "Malloc disk", 00:16:14.794 "block_size": 512, 00:16:14.794 "num_blocks": 65536, 00:16:14.794 "uuid": "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7", 00:16:14.794 "assigned_rate_limits": { 00:16:14.794 "rw_ios_per_sec": 0, 00:16:14.794 "rw_mbytes_per_sec": 0, 00:16:14.794 "r_mbytes_per_sec": 0, 00:16:14.794 "w_mbytes_per_sec": 0 00:16:14.794 }, 00:16:14.794 "claimed": true, 00:16:14.794 "claim_type": "exclusive_write", 00:16:14.794 "zoned": false, 00:16:14.794 "supported_io_types": { 00:16:14.794 "read": true, 00:16:14.794 "write": true, 00:16:14.794 "unmap": true, 00:16:14.794 "write_zeroes": true, 00:16:14.794 "flush": true, 00:16:14.794 "reset": true, 00:16:14.794 "compare": false, 00:16:14.794 "compare_and_write": false, 00:16:14.794 "abort": true, 00:16:14.794 "nvme_admin": false, 00:16:14.794 "nvme_io": false 00:16:14.794 }, 00:16:14.794 "memory_domains": [ 00:16:14.794 { 00:16:14.794 "dma_device_id": "system", 00:16:14.794 "dma_device_type": 1 00:16:14.794 }, 00:16:14.794 { 00:16:14.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.794 "dma_device_type": 2 00:16:14.794 } 00:16:14.794 ], 00:16:14.794 "driver_specific": {} 00:16:14.794 } 00:16:14.794 ] 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.794 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:15.055 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:15.055 "name": "Existed_Raid", 00:16:15.055 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.055 "strip_size_kb": 64, 00:16:15.055 "state": "configuring", 00:16:15.055 "raid_level": "raid0", 00:16:15.055 "superblock": false, 00:16:15.055 "num_base_bdevs": 4, 00:16:15.055 "num_base_bdevs_discovered": 3, 00:16:15.055 
"num_base_bdevs_operational": 4, 00:16:15.055 "base_bdevs_list": [ 00:16:15.055 { 00:16:15.055 "name": "BaseBdev1", 00:16:15.055 "uuid": "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7", 00:16:15.055 "is_configured": true, 00:16:15.055 "data_offset": 0, 00:16:15.055 "data_size": 65536 00:16:15.055 }, 00:16:15.055 { 00:16:15.055 "name": null, 00:16:15.055 "uuid": "9663027b-2664-4a3f-a3e8-c26d49f02365", 00:16:15.055 "is_configured": false, 00:16:15.055 "data_offset": 0, 00:16:15.055 "data_size": 65536 00:16:15.055 }, 00:16:15.055 { 00:16:15.055 "name": "BaseBdev3", 00:16:15.055 "uuid": "aac91ac1-263e-4766-a68d-b313228a0398", 00:16:15.055 "is_configured": true, 00:16:15.055 "data_offset": 0, 00:16:15.055 "data_size": 65536 00:16:15.055 }, 00:16:15.055 { 00:16:15.055 "name": "BaseBdev4", 00:16:15.055 "uuid": "23dd3227-1cac-4ba5-be78-fead955ae5ce", 00:16:15.055 "is_configured": true, 00:16:15.055 "data_offset": 0, 00:16:15.055 "data_size": 65536 00:16:15.055 } 00:16:15.055 ] 00:16:15.055 }' 00:16:15.055 10:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:15.056 10:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:15.626 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.626 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:15.626 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:15.626 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:15.886 [2024-06-10 10:11:37.603415] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.886 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:16.147 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:16:16.147 "name": "Existed_Raid", 00:16:16.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:16.147 "strip_size_kb": 64, 00:16:16.147 "state": "configuring", 00:16:16.147 "raid_level": "raid0", 00:16:16.147 "superblock": false, 00:16:16.147 "num_base_bdevs": 4, 00:16:16.147 "num_base_bdevs_discovered": 2, 00:16:16.147 "num_base_bdevs_operational": 4, 00:16:16.147 "base_bdevs_list": [ 00:16:16.147 { 00:16:16.147 "name": "BaseBdev1", 00:16:16.147 "uuid": "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7", 00:16:16.147 "is_configured": true, 00:16:16.147 "data_offset": 0, 00:16:16.147 "data_size": 65536 00:16:16.147 }, 00:16:16.147 { 00:16:16.147 "name": null, 00:16:16.147 "uuid": "9663027b-2664-4a3f-a3e8-c26d49f02365", 00:16:16.147 "is_configured": false, 00:16:16.147 "data_offset": 0, 00:16:16.147 "data_size": 65536 00:16:16.147 }, 00:16:16.147 { 00:16:16.147 "name": null, 00:16:16.147 "uuid": "aac91ac1-263e-4766-a68d-b313228a0398", 00:16:16.147 "is_configured": false, 00:16:16.147 "data_offset": 0, 00:16:16.147 "data_size": 65536 00:16:16.147 }, 00:16:16.147 { 00:16:16.147 "name": "BaseBdev4", 00:16:16.147 "uuid": "23dd3227-1cac-4ba5-be78-fead955ae5ce", 00:16:16.147 "is_configured": true, 00:16:16.147 "data_offset": 0, 00:16:16.147 "data_size": 65536 00:16:16.147 } 00:16:16.147 ] 00:16:16.147 }' 00:16:16.147 10:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.147 10:11:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.776 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.776 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:16.776 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:16.776 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:17.037 [2024-06-10 10:11:38.742299] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.037 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.298 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.298 "name": "Existed_Raid", 00:16:17.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:17.298 "strip_size_kb": 64, 00:16:17.298 "state": "configuring", 00:16:17.298 "raid_level": "raid0", 00:16:17.298 "superblock": false, 00:16:17.298 "num_base_bdevs": 4, 00:16:17.298 "num_base_bdevs_discovered": 3, 00:16:17.298 "num_base_bdevs_operational": 4, 00:16:17.298 "base_bdevs_list": [ 00:16:17.298 { 00:16:17.298 "name": "BaseBdev1", 00:16:17.298 "uuid": "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7", 00:16:17.298 "is_configured": true, 00:16:17.298 "data_offset": 0, 00:16:17.298 "data_size": 65536 00:16:17.298 }, 00:16:17.298 { 00:16:17.298 "name": null, 00:16:17.298 "uuid": "9663027b-2664-4a3f-a3e8-c26d49f02365", 00:16:17.298 "is_configured": false, 00:16:17.298 "data_offset": 0, 00:16:17.298 "data_size": 65536 00:16:17.298 }, 00:16:17.298 { 00:16:17.298 "name": "BaseBdev3", 00:16:17.298 "uuid": "aac91ac1-263e-4766-a68d-b313228a0398", 00:16:17.298 "is_configured": true, 00:16:17.298 "data_offset": 0, 00:16:17.298 "data_size": 65536 00:16:17.298 }, 00:16:17.298 { 00:16:17.298 "name": "BaseBdev4", 00:16:17.298 "uuid": "23dd3227-1cac-4ba5-be78-fead955ae5ce", 00:16:17.298 "is_configured": true, 00:16:17.298 "data_offset": 0, 00:16:17.298 "data_size": 65536 00:16:17.298 } 00:16:17.298 ] 00:16:17.298 }' 00:16:17.298 10:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.298 10:11:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:17.870 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.870 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:17.870 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:17.870 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:18.131 [2024-06-10 10:11:39.817037] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.131 10:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:18.393 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.393 "name": "Existed_Raid", 00:16:18.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.393 "strip_size_kb": 64, 00:16:18.393 "state": "configuring", 00:16:18.393 "raid_level": "raid0", 00:16:18.393 "superblock": false, 00:16:18.393 "num_base_bdevs": 4, 00:16:18.393 "num_base_bdevs_discovered": 2, 00:16:18.393 "num_base_bdevs_operational": 4, 00:16:18.393 "base_bdevs_list": [ 00:16:18.393 { 00:16:18.393 "name": null, 00:16:18.393 "uuid": "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7", 00:16:18.393 "is_configured": false, 00:16:18.393 "data_offset": 0, 00:16:18.393 "data_size": 65536 00:16:18.393 }, 00:16:18.393 { 00:16:18.393 "name": null, 00:16:18.393 "uuid": "9663027b-2664-4a3f-a3e8-c26d49f02365", 00:16:18.393 "is_configured": false, 00:16:18.393 "data_offset": 0, 00:16:18.393 "data_size": 65536 00:16:18.393 }, 00:16:18.393 { 00:16:18.393 "name": "BaseBdev3", 00:16:18.393 "uuid": "aac91ac1-263e-4766-a68d-b313228a0398", 00:16:18.393 "is_configured": true, 00:16:18.393 "data_offset": 0, 00:16:18.393 "data_size": 65536 00:16:18.393 }, 00:16:18.393 { 00:16:18.393 "name": "BaseBdev4", 00:16:18.393 "uuid": "23dd3227-1cac-4ba5-be78-fead955ae5ce", 00:16:18.393 "is_configured": true, 00:16:18.393 "data_offset": 0, 00:16:18.393 "data_size": 65536 00:16:18.393 } 00:16:18.393 ] 00:16:18.393 }' 00:16:18.393 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.393 10:11:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:18.964 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.964 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:18.964 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:18.964 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:19.225 [2024-06-10 10:11:40.909641] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.225 10:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:19.486 10:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.486 "name": "Existed_Raid", 00:16:19.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:19.486 "strip_size_kb": 64, 00:16:19.486 "state": "configuring", 00:16:19.486 "raid_level": "raid0", 00:16:19.486 "superblock": false, 00:16:19.486 "num_base_bdevs": 4, 00:16:19.486 "num_base_bdevs_discovered": 3, 00:16:19.486 "num_base_bdevs_operational": 4, 00:16:19.486 "base_bdevs_list": [ 00:16:19.486 { 00:16:19.486 "name": null, 00:16:19.486 "uuid": "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7", 00:16:19.486 "is_configured": false, 00:16:19.486 "data_offset": 0, 00:16:19.486 "data_size": 65536 00:16:19.486 }, 00:16:19.486 { 00:16:19.486 "name": "BaseBdev2", 00:16:19.486 "uuid": "9663027b-2664-4a3f-a3e8-c26d49f02365", 00:16:19.486 "is_configured": true, 00:16:19.486 "data_offset": 0, 00:16:19.486 "data_size": 65536 00:16:19.486 }, 00:16:19.486 { 00:16:19.486 "name": "BaseBdev3", 00:16:19.486 "uuid": "aac91ac1-263e-4766-a68d-b313228a0398", 00:16:19.486 "is_configured": true, 00:16:19.486 "data_offset": 0, 00:16:19.486 "data_size": 65536 00:16:19.486 }, 00:16:19.486 { 00:16:19.486 "name": "BaseBdev4", 00:16:19.486 "uuid": "23dd3227-1cac-4ba5-be78-fead955ae5ce", 00:16:19.486 "is_configured": true, 00:16:19.486 "data_offset": 0, 00:16:19.486 "data_size": 65536 00:16:19.486 } 00:16:19.486 ] 00:16:19.486 }' 00:16:19.486 10:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:19.486 10:11:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.058 10:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.058 10:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:20.058 10:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:20.058 10:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.058 10:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:20.319 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 
7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7 00:16:20.579 [2024-06-10 10:11:42.233910] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:20.579 [2024-06-10 10:11:42.233932] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23f7490 00:16:20.579 [2024-06-10 10:11:42.233941] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:20.579 [2024-06-10 10:11:42.234089] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23f7ca0 00:16:20.579 [2024-06-10 10:11:42.234180] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23f7490 00:16:20.579 [2024-06-10 10:11:42.234185] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23f7490 00:16:20.579 [2024-06-10 10:11:42.234302] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:20.579 NewBaseBdev 00:16:20.579 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:20.579 10:11:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:16:20.579 10:11:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:20.579 10:11:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:20.579 10:11:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:20.579 10:11:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:20.579 10:11:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:20.579 10:11:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:20.839 [ 00:16:20.839 { 00:16:20.839 "name": "NewBaseBdev", 00:16:20.839 "aliases": [ 00:16:20.839 "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7" 00:16:20.839 ], 00:16:20.839 "product_name": "Malloc disk", 00:16:20.839 "block_size": 512, 00:16:20.839 "num_blocks": 65536, 00:16:20.839 "uuid": "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7", 00:16:20.839 "assigned_rate_limits": { 00:16:20.839 "rw_ios_per_sec": 0, 00:16:20.839 "rw_mbytes_per_sec": 0, 00:16:20.839 "r_mbytes_per_sec": 0, 00:16:20.839 "w_mbytes_per_sec": 0 00:16:20.839 }, 00:16:20.839 "claimed": true, 00:16:20.839 "claim_type": "exclusive_write", 00:16:20.839 "zoned": false, 00:16:20.839 "supported_io_types": { 00:16:20.839 "read": true, 00:16:20.839 "write": true, 00:16:20.839 "unmap": true, 00:16:20.839 "write_zeroes": true, 00:16:20.839 "flush": true, 00:16:20.839 "reset": true, 00:16:20.839 "compare": false, 00:16:20.839 "compare_and_write": false, 00:16:20.839 "abort": true, 00:16:20.839 "nvme_admin": false, 00:16:20.839 "nvme_io": false 00:16:20.839 }, 00:16:20.839 "memory_domains": [ 00:16:20.839 { 00:16:20.839 "dma_device_id": "system", 00:16:20.839 "dma_device_type": 1 00:16:20.839 }, 00:16:20.839 { 00:16:20.839 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.839 "dma_device_type": 2 00:16:20.839 } 00:16:20.839 ], 00:16:20.839 "driver_specific": {} 00:16:20.839 } 00:16:20.839 ] 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:20.839 10:11:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.839 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:21.099 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.099 "name": "Existed_Raid", 00:16:21.099 "uuid": "b800c50a-dd43-4e0c-889f-41842c1d1d7a", 00:16:21.099 "strip_size_kb": 64, 00:16:21.099 "state": "online", 00:16:21.099 "raid_level": "raid0", 00:16:21.099 "superblock": false, 00:16:21.099 "num_base_bdevs": 4, 00:16:21.099 "num_base_bdevs_discovered": 4, 00:16:21.099 "num_base_bdevs_operational": 4, 00:16:21.099 "base_bdevs_list": [ 00:16:21.099 { 00:16:21.099 "name": "NewBaseBdev", 00:16:21.099 "uuid": "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7", 00:16:21.099 "is_configured": true, 00:16:21.099 "data_offset": 0, 00:16:21.099 "data_size": 65536 00:16:21.099 }, 00:16:21.099 { 00:16:21.099 "name": "BaseBdev2", 00:16:21.099 "uuid": "9663027b-2664-4a3f-a3e8-c26d49f02365", 00:16:21.099 "is_configured": true, 00:16:21.099 "data_offset": 0, 00:16:21.099 "data_size": 65536 00:16:21.099 }, 00:16:21.099 { 00:16:21.099 "name": "BaseBdev3", 00:16:21.099 "uuid": "aac91ac1-263e-4766-a68d-b313228a0398", 00:16:21.099 "is_configured": true, 00:16:21.099 "data_offset": 0, 00:16:21.099 "data_size": 65536 00:16:21.099 }, 00:16:21.099 { 00:16:21.099 "name": "BaseBdev4", 00:16:21.099 "uuid": "23dd3227-1cac-4ba5-be78-fead955ae5ce", 00:16:21.099 "is_configured": true, 00:16:21.099 "data_offset": 0, 00:16:21.099 "data_size": 65536 00:16:21.099 } 00:16:21.099 ] 00:16:21.099 }' 00:16:21.099 10:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.100 10:11:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.672 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:21.672 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:21.672 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:21.672 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local 
base_bdev_info 00:16:21.672 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:21.672 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:21.672 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:21.672 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:21.672 [2024-06-10 10:11:43.505356] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:21.672 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:21.672 "name": "Existed_Raid", 00:16:21.672 "aliases": [ 00:16:21.672 "b800c50a-dd43-4e0c-889f-41842c1d1d7a" 00:16:21.672 ], 00:16:21.672 "product_name": "Raid Volume", 00:16:21.672 "block_size": 512, 00:16:21.672 "num_blocks": 262144, 00:16:21.672 "uuid": "b800c50a-dd43-4e0c-889f-41842c1d1d7a", 00:16:21.672 "assigned_rate_limits": { 00:16:21.672 "rw_ios_per_sec": 0, 00:16:21.672 "rw_mbytes_per_sec": 0, 00:16:21.672 "r_mbytes_per_sec": 0, 00:16:21.672 "w_mbytes_per_sec": 0 00:16:21.672 }, 00:16:21.672 "claimed": false, 00:16:21.672 "zoned": false, 00:16:21.672 "supported_io_types": { 00:16:21.672 "read": true, 00:16:21.672 "write": true, 00:16:21.672 "unmap": true, 00:16:21.672 "write_zeroes": true, 00:16:21.672 "flush": true, 00:16:21.672 "reset": true, 00:16:21.672 "compare": false, 00:16:21.672 "compare_and_write": false, 00:16:21.672 "abort": false, 00:16:21.672 "nvme_admin": false, 00:16:21.672 "nvme_io": false 00:16:21.672 }, 00:16:21.672 "memory_domains": [ 00:16:21.672 { 00:16:21.672 "dma_device_id": "system", 00:16:21.672 "dma_device_type": 1 00:16:21.672 }, 00:16:21.672 { 00:16:21.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.672 "dma_device_type": 2 00:16:21.672 }, 00:16:21.672 { 00:16:21.672 "dma_device_id": "system", 00:16:21.672 "dma_device_type": 1 00:16:21.672 }, 00:16:21.672 { 00:16:21.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.672 "dma_device_type": 2 00:16:21.672 }, 00:16:21.672 { 00:16:21.672 "dma_device_id": "system", 00:16:21.672 "dma_device_type": 1 00:16:21.672 }, 00:16:21.672 { 00:16:21.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.672 "dma_device_type": 2 00:16:21.672 }, 00:16:21.672 { 00:16:21.672 "dma_device_id": "system", 00:16:21.672 "dma_device_type": 1 00:16:21.672 }, 00:16:21.672 { 00:16:21.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.672 "dma_device_type": 2 00:16:21.672 } 00:16:21.672 ], 00:16:21.672 "driver_specific": { 00:16:21.672 "raid": { 00:16:21.672 "uuid": "b800c50a-dd43-4e0c-889f-41842c1d1d7a", 00:16:21.672 "strip_size_kb": 64, 00:16:21.672 "state": "online", 00:16:21.672 "raid_level": "raid0", 00:16:21.672 "superblock": false, 00:16:21.672 "num_base_bdevs": 4, 00:16:21.672 "num_base_bdevs_discovered": 4, 00:16:21.672 "num_base_bdevs_operational": 4, 00:16:21.672 "base_bdevs_list": [ 00:16:21.672 { 00:16:21.672 "name": "NewBaseBdev", 00:16:21.672 "uuid": "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7", 00:16:21.672 "is_configured": true, 00:16:21.672 "data_offset": 0, 00:16:21.672 "data_size": 65536 00:16:21.672 }, 00:16:21.672 { 00:16:21.672 "name": "BaseBdev2", 00:16:21.672 "uuid": "9663027b-2664-4a3f-a3e8-c26d49f02365", 00:16:21.672 "is_configured": true, 00:16:21.672 "data_offset": 0, 00:16:21.672 "data_size": 65536 00:16:21.672 }, 00:16:21.672 { 00:16:21.672 "name": 
"BaseBdev3", 00:16:21.672 "uuid": "aac91ac1-263e-4766-a68d-b313228a0398", 00:16:21.672 "is_configured": true, 00:16:21.672 "data_offset": 0, 00:16:21.672 "data_size": 65536 00:16:21.672 }, 00:16:21.672 { 00:16:21.672 "name": "BaseBdev4", 00:16:21.672 "uuid": "23dd3227-1cac-4ba5-be78-fead955ae5ce", 00:16:21.672 "is_configured": true, 00:16:21.672 "data_offset": 0, 00:16:21.672 "data_size": 65536 00:16:21.672 } 00:16:21.672 ] 00:16:21.672 } 00:16:21.672 } 00:16:21.672 }' 00:16:21.672 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:21.933 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:21.933 BaseBdev2 00:16:21.933 BaseBdev3 00:16:21.933 BaseBdev4' 00:16:21.933 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:21.933 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:21.933 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:21.933 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:21.933 "name": "NewBaseBdev", 00:16:21.933 "aliases": [ 00:16:21.933 "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7" 00:16:21.933 ], 00:16:21.933 "product_name": "Malloc disk", 00:16:21.933 "block_size": 512, 00:16:21.933 "num_blocks": 65536, 00:16:21.933 "uuid": "7b4bf6fb-b490-4ad2-ab90-778f9dcdfce7", 00:16:21.933 "assigned_rate_limits": { 00:16:21.933 "rw_ios_per_sec": 0, 00:16:21.933 "rw_mbytes_per_sec": 0, 00:16:21.933 "r_mbytes_per_sec": 0, 00:16:21.933 "w_mbytes_per_sec": 0 00:16:21.933 }, 00:16:21.933 "claimed": true, 00:16:21.933 "claim_type": "exclusive_write", 00:16:21.933 "zoned": false, 00:16:21.933 "supported_io_types": { 00:16:21.933 "read": true, 00:16:21.933 "write": true, 00:16:21.933 "unmap": true, 00:16:21.933 "write_zeroes": true, 00:16:21.933 "flush": true, 00:16:21.933 "reset": true, 00:16:21.933 "compare": false, 00:16:21.933 "compare_and_write": false, 00:16:21.933 "abort": true, 00:16:21.933 "nvme_admin": false, 00:16:21.933 "nvme_io": false 00:16:21.933 }, 00:16:21.933 "memory_domains": [ 00:16:21.933 { 00:16:21.934 "dma_device_id": "system", 00:16:21.934 "dma_device_type": 1 00:16:21.934 }, 00:16:21.934 { 00:16:21.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.934 "dma_device_type": 2 00:16:21.934 } 00:16:21.934 ], 00:16:21.934 "driver_specific": {} 00:16:21.934 }' 00:16:21.934 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.194 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.194 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:22.194 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.194 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.194 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:22.194 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.194 10:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.194 10:11:44 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:22.194 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.456 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.456 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:22.456 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:22.456 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:22.456 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:22.456 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:22.456 "name": "BaseBdev2", 00:16:22.456 "aliases": [ 00:16:22.456 "9663027b-2664-4a3f-a3e8-c26d49f02365" 00:16:22.456 ], 00:16:22.456 "product_name": "Malloc disk", 00:16:22.456 "block_size": 512, 00:16:22.456 "num_blocks": 65536, 00:16:22.456 "uuid": "9663027b-2664-4a3f-a3e8-c26d49f02365", 00:16:22.456 "assigned_rate_limits": { 00:16:22.456 "rw_ios_per_sec": 0, 00:16:22.456 "rw_mbytes_per_sec": 0, 00:16:22.456 "r_mbytes_per_sec": 0, 00:16:22.456 "w_mbytes_per_sec": 0 00:16:22.456 }, 00:16:22.456 "claimed": true, 00:16:22.456 "claim_type": "exclusive_write", 00:16:22.456 "zoned": false, 00:16:22.456 "supported_io_types": { 00:16:22.456 "read": true, 00:16:22.456 "write": true, 00:16:22.456 "unmap": true, 00:16:22.456 "write_zeroes": true, 00:16:22.456 "flush": true, 00:16:22.456 "reset": true, 00:16:22.456 "compare": false, 00:16:22.456 "compare_and_write": false, 00:16:22.456 "abort": true, 00:16:22.456 "nvme_admin": false, 00:16:22.456 "nvme_io": false 00:16:22.456 }, 00:16:22.456 "memory_domains": [ 00:16:22.456 { 00:16:22.456 "dma_device_id": "system", 00:16:22.456 "dma_device_type": 1 00:16:22.456 }, 00:16:22.456 { 00:16:22.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.456 "dma_device_type": 2 00:16:22.456 } 00:16:22.456 ], 00:16:22.456 "driver_specific": {} 00:16:22.456 }' 00:16:22.456 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.716 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.716 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:22.716 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.716 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.717 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:22.717 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.717 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.977 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:22.977 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.977 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.977 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:22.977 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 
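The block_size/md_size/md_interleave/dif_type checks traced above are plain jq lookups on bdev_get_bdevs output. A minimal standalone sketch of the same pattern, assuming a target is already listening on /var/tmp/spdk-raid.sock; BaseBdev3 and the variable names are purely illustrative, not taken from the test script:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    name=BaseBdev3                                      # any base bdev of the raid under test
    info=$($RPC bdev_get_bdevs -b "$name")              # JSON array with a single element
    block_size=$(echo "$info" | jq '.[].block_size')    # 512 for the malloc bdevs in this test
    md_size=$(echo "$info" | jq '.[].md_size')          # null: no per-block metadata configured
    [[ "$block_size" == "512" && "$md_size" == "null" ]] && echo "$name: 512B blocks, no metadata"
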
00:16:22.977 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:22.977 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:23.238 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:23.238 "name": "BaseBdev3", 00:16:23.238 "aliases": [ 00:16:23.238 "aac91ac1-263e-4766-a68d-b313228a0398" 00:16:23.238 ], 00:16:23.238 "product_name": "Malloc disk", 00:16:23.238 "block_size": 512, 00:16:23.238 "num_blocks": 65536, 00:16:23.238 "uuid": "aac91ac1-263e-4766-a68d-b313228a0398", 00:16:23.238 "assigned_rate_limits": { 00:16:23.238 "rw_ios_per_sec": 0, 00:16:23.238 "rw_mbytes_per_sec": 0, 00:16:23.238 "r_mbytes_per_sec": 0, 00:16:23.238 "w_mbytes_per_sec": 0 00:16:23.238 }, 00:16:23.238 "claimed": true, 00:16:23.238 "claim_type": "exclusive_write", 00:16:23.238 "zoned": false, 00:16:23.238 "supported_io_types": { 00:16:23.238 "read": true, 00:16:23.238 "write": true, 00:16:23.238 "unmap": true, 00:16:23.238 "write_zeroes": true, 00:16:23.238 "flush": true, 00:16:23.238 "reset": true, 00:16:23.238 "compare": false, 00:16:23.238 "compare_and_write": false, 00:16:23.238 "abort": true, 00:16:23.238 "nvme_admin": false, 00:16:23.238 "nvme_io": false 00:16:23.238 }, 00:16:23.238 "memory_domains": [ 00:16:23.238 { 00:16:23.238 "dma_device_id": "system", 00:16:23.238 "dma_device_type": 1 00:16:23.238 }, 00:16:23.238 { 00:16:23.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.238 "dma_device_type": 2 00:16:23.238 } 00:16:23.238 ], 00:16:23.238 "driver_specific": {} 00:16:23.238 }' 00:16:23.238 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.238 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.238 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:23.238 10:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.238 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.238 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:23.238 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.238 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.500 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:23.500 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.500 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.500 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:23.500 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:23.500 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:23.500 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:23.761 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:23.761 "name": "BaseBdev4", 00:16:23.761 "aliases": [ 00:16:23.761 
"23dd3227-1cac-4ba5-be78-fead955ae5ce" 00:16:23.761 ], 00:16:23.761 "product_name": "Malloc disk", 00:16:23.761 "block_size": 512, 00:16:23.761 "num_blocks": 65536, 00:16:23.761 "uuid": "23dd3227-1cac-4ba5-be78-fead955ae5ce", 00:16:23.761 "assigned_rate_limits": { 00:16:23.761 "rw_ios_per_sec": 0, 00:16:23.761 "rw_mbytes_per_sec": 0, 00:16:23.761 "r_mbytes_per_sec": 0, 00:16:23.761 "w_mbytes_per_sec": 0 00:16:23.761 }, 00:16:23.761 "claimed": true, 00:16:23.761 "claim_type": "exclusive_write", 00:16:23.761 "zoned": false, 00:16:23.761 "supported_io_types": { 00:16:23.761 "read": true, 00:16:23.761 "write": true, 00:16:23.761 "unmap": true, 00:16:23.761 "write_zeroes": true, 00:16:23.761 "flush": true, 00:16:23.761 "reset": true, 00:16:23.761 "compare": false, 00:16:23.761 "compare_and_write": false, 00:16:23.761 "abort": true, 00:16:23.761 "nvme_admin": false, 00:16:23.761 "nvme_io": false 00:16:23.761 }, 00:16:23.761 "memory_domains": [ 00:16:23.761 { 00:16:23.761 "dma_device_id": "system", 00:16:23.762 "dma_device_type": 1 00:16:23.762 }, 00:16:23.762 { 00:16:23.762 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.762 "dma_device_type": 2 00:16:23.762 } 00:16:23.762 ], 00:16:23.762 "driver_specific": {} 00:16:23.762 }' 00:16:23.762 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.762 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.762 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:23.762 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.762 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.762 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:23.762 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.022 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.023 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:24.023 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.023 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.023 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:24.023 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:24.283 [2024-06-10 10:11:45.943441] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:24.283 [2024-06-10 10:11:45.943458] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:24.283 [2024-06-10 10:11:45.943493] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:24.283 [2024-06-10 10:11:45.943538] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:24.283 [2024-06-10 10:11:45.943544] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23f7490 name Existed_Raid, state offline 00:16:24.283 10:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1021248 00:16:24.283 10:11:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 
-- # '[' -z 1021248 ']' 00:16:24.283 10:11:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1021248 00:16:24.283 10:11:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:16:24.283 10:11:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:16:24.283 10:11:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1021248 00:16:24.283 10:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:16:24.283 10:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:16:24.283 10:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1021248' 00:16:24.283 killing process with pid 1021248 00:16:24.283 10:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1021248 00:16:24.283 [2024-06-10 10:11:46.011234] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:24.283 10:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1021248 00:16:24.283 [2024-06-10 10:11:46.031641] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:24.544 00:16:24.544 real 0m26.864s 00:16:24.544 user 0m50.417s 00:16:24.544 sys 0m3.895s 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.544 ************************************ 00:16:24.544 END TEST raid_state_function_test 00:16:24.544 ************************************ 00:16:24.544 10:11:46 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:16:24.544 10:11:46 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:16:24.544 10:11:46 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:16:24.544 10:11:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:24.544 ************************************ 00:16:24.544 START TEST raid_state_function_test_sb 00:16:24.544 ************************************ 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 4 true 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:24.544 
10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:24.544 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1026437 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1026437' 00:16:24.545 Process raid pid: 1026437 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1026437 /var/tmp/spdk-raid.sock 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1026437 ']' 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:24.545 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
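The superblock variant is brought up the same way as the first test: bdev_svc is started on its own RPC socket and the script blocks until that socket answers. A rough sketch of this launch-and-wait step, using the paths from this workspace; the polling loop is an assumption standing in for the waitforlisten helper in autotest_common.sh, which does considerably more bookkeeping:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    SOCK=/var/tmp/spdk-raid.sock
    "$SPDK/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
    raid_pid=$!
    for _ in $(seq 1 100); do                           # allow roughly 10s for the RPC server to come up
        "$SPDK/scripts/rpc.py" -s "$SOCK" rpc_get_methods >/dev/null 2>&1 && break
        sleep 0.1
    done
    echo "Process raid pid: $raid_pid"
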
00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:16:24.545 10:11:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:24.545 [2024-06-10 10:11:46.292242] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:16:24.545 [2024-06-10 10:11:46.292293] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:24.545 [2024-06-10 10:11:46.383327] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:24.805 [2024-06-10 10:11:46.448001] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:16:24.805 [2024-06-10 10:11:46.489633] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:24.805 [2024-06-10 10:11:46.489654] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:25.375 10:11:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:16:25.375 10:11:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:16:25.375 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:25.635 [2024-06-10 10:11:47.312987] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:25.635 [2024-06-10 10:11:47.313022] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:25.635 [2024-06-10 10:11:47.313028] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:25.635 [2024-06-10 10:11:47.313034] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:25.635 [2024-06-10 10:11:47.313041] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:25.635 [2024-06-10 10:11:47.313047] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:25.635 [2024-06-10 10:11:47.313051] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:25.635 [2024-06-10 10:11:47.313057] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:25.635 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:25.635 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:25.635 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:25.635 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:25.635 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:25.635 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:25.635 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:25.635 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:25.635 
10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:25.635 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:25.635 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.635 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:25.895 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:25.895 "name": "Existed_Raid", 00:16:25.895 "uuid": "29850ea5-9444-41e8-98ac-2d127af0a287", 00:16:25.896 "strip_size_kb": 64, 00:16:25.896 "state": "configuring", 00:16:25.896 "raid_level": "raid0", 00:16:25.896 "superblock": true, 00:16:25.896 "num_base_bdevs": 4, 00:16:25.896 "num_base_bdevs_discovered": 0, 00:16:25.896 "num_base_bdevs_operational": 4, 00:16:25.896 "base_bdevs_list": [ 00:16:25.896 { 00:16:25.896 "name": "BaseBdev1", 00:16:25.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.896 "is_configured": false, 00:16:25.896 "data_offset": 0, 00:16:25.896 "data_size": 0 00:16:25.896 }, 00:16:25.896 { 00:16:25.896 "name": "BaseBdev2", 00:16:25.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.896 "is_configured": false, 00:16:25.896 "data_offset": 0, 00:16:25.896 "data_size": 0 00:16:25.896 }, 00:16:25.896 { 00:16:25.896 "name": "BaseBdev3", 00:16:25.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.896 "is_configured": false, 00:16:25.896 "data_offset": 0, 00:16:25.896 "data_size": 0 00:16:25.896 }, 00:16:25.896 { 00:16:25.896 "name": "BaseBdev4", 00:16:25.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.896 "is_configured": false, 00:16:25.896 "data_offset": 0, 00:16:25.896 "data_size": 0 00:16:25.896 } 00:16:25.896 ] 00:16:25.896 }' 00:16:25.896 10:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:25.896 10:11:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:26.465 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:26.465 [2024-06-10 10:11:48.195101] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:26.465 [2024-06-10 10:11:48.195117] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c30b20 name Existed_Raid, state configuring 00:16:26.465 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:26.725 [2024-06-10 10:11:48.343506] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:26.725 [2024-06-10 10:11:48.343524] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:26.725 [2024-06-10 10:11:48.343530] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:26.725 [2024-06-10 10:11:48.343535] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:26.725 [2024-06-10 10:11:48.343540] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: BaseBdev3 00:16:26.725 [2024-06-10 10:11:48.343545] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:26.725 [2024-06-10 10:11:48.343550] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:26.725 [2024-06-10 10:11:48.343560] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:26.725 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:26.725 [2024-06-10 10:11:48.494311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:26.725 BaseBdev1 00:16:26.725 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:26.725 10:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:16:26.725 10:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:26.725 10:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:16:26.725 10:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:26.725 10:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:26.725 10:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:26.985 [ 00:16:26.985 { 00:16:26.985 "name": "BaseBdev1", 00:16:26.985 "aliases": [ 00:16:26.985 "606d2a07-934a-40ea-b4c6-1851e4d039c7" 00:16:26.985 ], 00:16:26.985 "product_name": "Malloc disk", 00:16:26.985 "block_size": 512, 00:16:26.985 "num_blocks": 65536, 00:16:26.985 "uuid": "606d2a07-934a-40ea-b4c6-1851e4d039c7", 00:16:26.985 "assigned_rate_limits": { 00:16:26.985 "rw_ios_per_sec": 0, 00:16:26.985 "rw_mbytes_per_sec": 0, 00:16:26.985 "r_mbytes_per_sec": 0, 00:16:26.985 "w_mbytes_per_sec": 0 00:16:26.985 }, 00:16:26.985 "claimed": true, 00:16:26.985 "claim_type": "exclusive_write", 00:16:26.985 "zoned": false, 00:16:26.985 "supported_io_types": { 00:16:26.985 "read": true, 00:16:26.985 "write": true, 00:16:26.985 "unmap": true, 00:16:26.985 "write_zeroes": true, 00:16:26.985 "flush": true, 00:16:26.985 "reset": true, 00:16:26.985 "compare": false, 00:16:26.985 "compare_and_write": false, 00:16:26.985 "abort": true, 00:16:26.985 "nvme_admin": false, 00:16:26.985 "nvme_io": false 00:16:26.985 }, 00:16:26.985 "memory_domains": [ 00:16:26.985 { 00:16:26.985 "dma_device_id": "system", 00:16:26.985 "dma_device_type": 1 00:16:26.985 }, 00:16:26.985 { 00:16:26.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.985 "dma_device_type": 2 00:16:26.985 } 00:16:26.985 ], 00:16:26.985 "driver_specific": {} 00:16:26.985 } 00:16:26.985 ] 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:26.985 
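Condensing the step being traced here into the RPCs that appear in this log (socket path and sizes as in this test): the raid is created first with -s for an on-disk superblock, so it sits in the "configuring" state, and each base bdev is claimed as soon as its malloc backing exists. The superblock also explains why the dumps that follow show data_offset 2048 and data_size 63488 rather than 0 and 65536:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # 4-disk raid0, 64 KiB strip, with superblock; the base bdevs do not exist yet
    $RPC bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
    # first member: 32 MiB malloc bdev with 512-byte blocks (65536 blocks)
    $RPC bdev_malloc_create 32 512 -b BaseBdev1
    # the raid claims it but stays "configuring" until all four members are present
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs)"'
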
10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.985 10:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:27.245 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.245 "name": "Existed_Raid", 00:16:27.245 "uuid": "1cb03f97-dde3-40f9-8237-5bdc5a2c156e", 00:16:27.245 "strip_size_kb": 64, 00:16:27.245 "state": "configuring", 00:16:27.245 "raid_level": "raid0", 00:16:27.245 "superblock": true, 00:16:27.245 "num_base_bdevs": 4, 00:16:27.245 "num_base_bdevs_discovered": 1, 00:16:27.245 "num_base_bdevs_operational": 4, 00:16:27.245 "base_bdevs_list": [ 00:16:27.245 { 00:16:27.245 "name": "BaseBdev1", 00:16:27.245 "uuid": "606d2a07-934a-40ea-b4c6-1851e4d039c7", 00:16:27.245 "is_configured": true, 00:16:27.245 "data_offset": 2048, 00:16:27.245 "data_size": 63488 00:16:27.245 }, 00:16:27.245 { 00:16:27.245 "name": "BaseBdev2", 00:16:27.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.245 "is_configured": false, 00:16:27.245 "data_offset": 0, 00:16:27.245 "data_size": 0 00:16:27.245 }, 00:16:27.245 { 00:16:27.245 "name": "BaseBdev3", 00:16:27.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.245 "is_configured": false, 00:16:27.245 "data_offset": 0, 00:16:27.245 "data_size": 0 00:16:27.245 }, 00:16:27.245 { 00:16:27.245 "name": "BaseBdev4", 00:16:27.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.245 "is_configured": false, 00:16:27.245 "data_offset": 0, 00:16:27.245 "data_size": 0 00:16:27.245 } 00:16:27.245 ] 00:16:27.245 }' 00:16:27.245 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.245 10:11:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:27.815 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:28.076 [2024-06-10 10:11:49.733437] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:28.076 [2024-06-10 10:11:49.733465] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c303b0 name Existed_Raid, state configuring 00:16:28.076 10:11:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:28.076 [2024-06-10 10:11:49.925959] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:28.076 [2024-06-10 10:11:49.927097] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:28.076 [2024-06-10 10:11:49.927121] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:28.076 [2024-06-10 10:11:49.927127] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:28.076 [2024-06-10 10:11:49.927132] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:28.076 [2024-06-10 10:11:49.927137] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:28.076 [2024-06-10 10:11:49.927143] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.336 10:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:28.336 10:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.336 "name": "Existed_Raid", 00:16:28.336 "uuid": "381a1f90-4488-4296-b706-36033ebb2f2f", 00:16:28.336 "strip_size_kb": 64, 00:16:28.336 "state": "configuring", 00:16:28.336 "raid_level": "raid0", 00:16:28.336 "superblock": true, 00:16:28.336 "num_base_bdevs": 4, 00:16:28.336 "num_base_bdevs_discovered": 1, 00:16:28.336 "num_base_bdevs_operational": 4, 00:16:28.336 "base_bdevs_list": [ 00:16:28.336 { 00:16:28.336 "name": "BaseBdev1", 00:16:28.336 "uuid": "606d2a07-934a-40ea-b4c6-1851e4d039c7", 00:16:28.336 "is_configured": true, 00:16:28.336 "data_offset": 2048, 
00:16:28.336 "data_size": 63488 00:16:28.336 }, 00:16:28.336 { 00:16:28.336 "name": "BaseBdev2", 00:16:28.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.336 "is_configured": false, 00:16:28.336 "data_offset": 0, 00:16:28.336 "data_size": 0 00:16:28.336 }, 00:16:28.336 { 00:16:28.336 "name": "BaseBdev3", 00:16:28.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.336 "is_configured": false, 00:16:28.336 "data_offset": 0, 00:16:28.336 "data_size": 0 00:16:28.336 }, 00:16:28.336 { 00:16:28.336 "name": "BaseBdev4", 00:16:28.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.336 "is_configured": false, 00:16:28.336 "data_offset": 0, 00:16:28.336 "data_size": 0 00:16:28.336 } 00:16:28.336 ] 00:16:28.336 }' 00:16:28.336 10:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.336 10:11:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:28.906 10:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:29.166 [2024-06-10 10:11:50.869147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:29.166 BaseBdev2 00:16:29.166 10:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:29.166 10:11:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:16:29.166 10:11:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:29.166 10:11:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:16:29.166 10:11:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:29.166 10:11:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:29.166 10:11:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:29.427 [ 00:16:29.427 { 00:16:29.427 "name": "BaseBdev2", 00:16:29.427 "aliases": [ 00:16:29.427 "43826a7d-3d6a-45f5-b0e3-69c277f37d3d" 00:16:29.427 ], 00:16:29.427 "product_name": "Malloc disk", 00:16:29.427 "block_size": 512, 00:16:29.427 "num_blocks": 65536, 00:16:29.427 "uuid": "43826a7d-3d6a-45f5-b0e3-69c277f37d3d", 00:16:29.427 "assigned_rate_limits": { 00:16:29.427 "rw_ios_per_sec": 0, 00:16:29.427 "rw_mbytes_per_sec": 0, 00:16:29.427 "r_mbytes_per_sec": 0, 00:16:29.427 "w_mbytes_per_sec": 0 00:16:29.427 }, 00:16:29.427 "claimed": true, 00:16:29.427 "claim_type": "exclusive_write", 00:16:29.427 "zoned": false, 00:16:29.427 "supported_io_types": { 00:16:29.427 "read": true, 00:16:29.427 "write": true, 00:16:29.427 "unmap": true, 00:16:29.427 "write_zeroes": true, 00:16:29.427 "flush": true, 00:16:29.427 "reset": true, 00:16:29.427 "compare": false, 00:16:29.427 "compare_and_write": false, 00:16:29.427 "abort": true, 00:16:29.427 "nvme_admin": false, 00:16:29.427 "nvme_io": false 00:16:29.427 }, 00:16:29.427 "memory_domains": [ 00:16:29.427 { 00:16:29.427 "dma_device_id": "system", 
00:16:29.427 "dma_device_type": 1 00:16:29.427 }, 00:16:29.427 { 00:16:29.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.427 "dma_device_type": 2 00:16:29.427 } 00:16:29.427 ], 00:16:29.427 "driver_specific": {} 00:16:29.427 } 00:16:29.427 ] 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.427 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:29.688 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.688 "name": "Existed_Raid", 00:16:29.688 "uuid": "381a1f90-4488-4296-b706-36033ebb2f2f", 00:16:29.688 "strip_size_kb": 64, 00:16:29.688 "state": "configuring", 00:16:29.688 "raid_level": "raid0", 00:16:29.688 "superblock": true, 00:16:29.688 "num_base_bdevs": 4, 00:16:29.688 "num_base_bdevs_discovered": 2, 00:16:29.688 "num_base_bdevs_operational": 4, 00:16:29.688 "base_bdevs_list": [ 00:16:29.688 { 00:16:29.688 "name": "BaseBdev1", 00:16:29.688 "uuid": "606d2a07-934a-40ea-b4c6-1851e4d039c7", 00:16:29.688 "is_configured": true, 00:16:29.688 "data_offset": 2048, 00:16:29.688 "data_size": 63488 00:16:29.688 }, 00:16:29.688 { 00:16:29.688 "name": "BaseBdev2", 00:16:29.688 "uuid": "43826a7d-3d6a-45f5-b0e3-69c277f37d3d", 00:16:29.688 "is_configured": true, 00:16:29.688 "data_offset": 2048, 00:16:29.688 "data_size": 63488 00:16:29.688 }, 00:16:29.688 { 00:16:29.688 "name": "BaseBdev3", 00:16:29.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.688 "is_configured": false, 00:16:29.688 "data_offset": 0, 00:16:29.688 "data_size": 0 00:16:29.688 }, 00:16:29.688 { 00:16:29.688 "name": "BaseBdev4", 00:16:29.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.688 "is_configured": false, 00:16:29.688 "data_offset": 0, 00:16:29.688 "data_size": 0 00:16:29.688 } 00:16:29.688 ] 00:16:29.688 }' 00:16:29.688 
10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.688 10:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:30.259 10:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:30.259 [2024-06-10 10:11:52.125280] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:30.521 BaseBdev3 00:16:30.521 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:30.521 10:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:16:30.521 10:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:30.521 10:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:16:30.521 10:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:30.521 10:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:30.521 10:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.521 10:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:30.809 [ 00:16:30.809 { 00:16:30.809 "name": "BaseBdev3", 00:16:30.809 "aliases": [ 00:16:30.809 "02c25991-d86b-412a-8a95-a964a9ed52a6" 00:16:30.809 ], 00:16:30.809 "product_name": "Malloc disk", 00:16:30.809 "block_size": 512, 00:16:30.809 "num_blocks": 65536, 00:16:30.809 "uuid": "02c25991-d86b-412a-8a95-a964a9ed52a6", 00:16:30.809 "assigned_rate_limits": { 00:16:30.809 "rw_ios_per_sec": 0, 00:16:30.809 "rw_mbytes_per_sec": 0, 00:16:30.809 "r_mbytes_per_sec": 0, 00:16:30.809 "w_mbytes_per_sec": 0 00:16:30.809 }, 00:16:30.809 "claimed": true, 00:16:30.809 "claim_type": "exclusive_write", 00:16:30.809 "zoned": false, 00:16:30.809 "supported_io_types": { 00:16:30.809 "read": true, 00:16:30.809 "write": true, 00:16:30.809 "unmap": true, 00:16:30.809 "write_zeroes": true, 00:16:30.809 "flush": true, 00:16:30.809 "reset": true, 00:16:30.809 "compare": false, 00:16:30.809 "compare_and_write": false, 00:16:30.809 "abort": true, 00:16:30.809 "nvme_admin": false, 00:16:30.809 "nvme_io": false 00:16:30.809 }, 00:16:30.809 "memory_domains": [ 00:16:30.809 { 00:16:30.809 "dma_device_id": "system", 00:16:30.809 "dma_device_type": 1 00:16:30.809 }, 00:16:30.809 { 00:16:30.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.809 "dma_device_type": 2 00:16:30.809 } 00:16:30.809 ], 00:16:30.809 "driver_specific": {} 00:16:30.809 } 00:16:30.809 ] 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:30.809 10:11:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.809 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:31.073 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.073 "name": "Existed_Raid", 00:16:31.073 "uuid": "381a1f90-4488-4296-b706-36033ebb2f2f", 00:16:31.073 "strip_size_kb": 64, 00:16:31.073 "state": "configuring", 00:16:31.073 "raid_level": "raid0", 00:16:31.073 "superblock": true, 00:16:31.073 "num_base_bdevs": 4, 00:16:31.073 "num_base_bdevs_discovered": 3, 00:16:31.073 "num_base_bdevs_operational": 4, 00:16:31.073 "base_bdevs_list": [ 00:16:31.073 { 00:16:31.073 "name": "BaseBdev1", 00:16:31.073 "uuid": "606d2a07-934a-40ea-b4c6-1851e4d039c7", 00:16:31.073 "is_configured": true, 00:16:31.073 "data_offset": 2048, 00:16:31.073 "data_size": 63488 00:16:31.073 }, 00:16:31.073 { 00:16:31.073 "name": "BaseBdev2", 00:16:31.073 "uuid": "43826a7d-3d6a-45f5-b0e3-69c277f37d3d", 00:16:31.073 "is_configured": true, 00:16:31.073 "data_offset": 2048, 00:16:31.073 "data_size": 63488 00:16:31.073 }, 00:16:31.073 { 00:16:31.073 "name": "BaseBdev3", 00:16:31.073 "uuid": "02c25991-d86b-412a-8a95-a964a9ed52a6", 00:16:31.073 "is_configured": true, 00:16:31.073 "data_offset": 2048, 00:16:31.073 "data_size": 63488 00:16:31.073 }, 00:16:31.073 { 00:16:31.073 "name": "BaseBdev4", 00:16:31.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.073 "is_configured": false, 00:16:31.073 "data_offset": 0, 00:16:31.073 "data_size": 0 00:16:31.073 } 00:16:31.073 ] 00:16:31.073 }' 00:16:31.073 10:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.073 10:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:31.643 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:31.643 [2024-06-10 10:11:53.397202] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:31.643 [2024-06-10 10:11:53.397331] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c314c0 00:16:31.643 [2024-06-10 10:11:53.397340] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:31.643 [2024-06-10 10:11:53.397479] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1de2820 00:16:31.643 [2024-06-10 10:11:53.397567] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c314c0 00:16:31.643 [2024-06-10 10:11:53.397572] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c314c0 00:16:31.643 [2024-06-10 10:11:53.397640] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:31.643 BaseBdev4 00:16:31.643 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:31.643 10:11:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:16:31.643 10:11:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:31.644 10:11:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:16:31.644 10:11:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:31.644 10:11:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:31.644 10:11:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:31.904 10:11:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:32.165 [ 00:16:32.165 { 00:16:32.165 "name": "BaseBdev4", 00:16:32.165 "aliases": [ 00:16:32.165 "2bbbde77-4d8e-49c9-b579-59bd9b6d758b" 00:16:32.165 ], 00:16:32.165 "product_name": "Malloc disk", 00:16:32.165 "block_size": 512, 00:16:32.165 "num_blocks": 65536, 00:16:32.165 "uuid": "2bbbde77-4d8e-49c9-b579-59bd9b6d758b", 00:16:32.165 "assigned_rate_limits": { 00:16:32.165 "rw_ios_per_sec": 0, 00:16:32.165 "rw_mbytes_per_sec": 0, 00:16:32.165 "r_mbytes_per_sec": 0, 00:16:32.165 "w_mbytes_per_sec": 0 00:16:32.165 }, 00:16:32.165 "claimed": true, 00:16:32.165 "claim_type": "exclusive_write", 00:16:32.165 "zoned": false, 00:16:32.165 "supported_io_types": { 00:16:32.165 "read": true, 00:16:32.165 "write": true, 00:16:32.165 "unmap": true, 00:16:32.165 "write_zeroes": true, 00:16:32.165 "flush": true, 00:16:32.165 "reset": true, 00:16:32.165 "compare": false, 00:16:32.165 "compare_and_write": false, 00:16:32.165 "abort": true, 00:16:32.165 "nvme_admin": false, 00:16:32.165 "nvme_io": false 00:16:32.165 }, 00:16:32.165 "memory_domains": [ 00:16:32.165 { 00:16:32.165 "dma_device_id": "system", 00:16:32.165 "dma_device_type": 1 00:16:32.165 }, 00:16:32.165 { 00:16:32.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.165 "dma_device_type": 2 00:16:32.165 } 00:16:32.165 ], 00:16:32.165 "driver_specific": {} 00:16:32.165 } 00:16:32.165 ] 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online 
raid0 64 4 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.165 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.165 "name": "Existed_Raid", 00:16:32.165 "uuid": "381a1f90-4488-4296-b706-36033ebb2f2f", 00:16:32.165 "strip_size_kb": 64, 00:16:32.165 "state": "online", 00:16:32.165 "raid_level": "raid0", 00:16:32.165 "superblock": true, 00:16:32.165 "num_base_bdevs": 4, 00:16:32.165 "num_base_bdevs_discovered": 4, 00:16:32.165 "num_base_bdevs_operational": 4, 00:16:32.165 "base_bdevs_list": [ 00:16:32.165 { 00:16:32.165 "name": "BaseBdev1", 00:16:32.165 "uuid": "606d2a07-934a-40ea-b4c6-1851e4d039c7", 00:16:32.165 "is_configured": true, 00:16:32.165 "data_offset": 2048, 00:16:32.165 "data_size": 63488 00:16:32.165 }, 00:16:32.165 { 00:16:32.165 "name": "BaseBdev2", 00:16:32.165 "uuid": "43826a7d-3d6a-45f5-b0e3-69c277f37d3d", 00:16:32.165 "is_configured": true, 00:16:32.165 "data_offset": 2048, 00:16:32.165 "data_size": 63488 00:16:32.165 }, 00:16:32.165 { 00:16:32.165 "name": "BaseBdev3", 00:16:32.165 "uuid": "02c25991-d86b-412a-8a95-a964a9ed52a6", 00:16:32.165 "is_configured": true, 00:16:32.166 "data_offset": 2048, 00:16:32.166 "data_size": 63488 00:16:32.166 }, 00:16:32.166 { 00:16:32.166 "name": "BaseBdev4", 00:16:32.166 "uuid": "2bbbde77-4d8e-49c9-b579-59bd9b6d758b", 00:16:32.166 "is_configured": true, 00:16:32.166 "data_offset": 2048, 00:16:32.166 "data_size": 63488 00:16:32.166 } 00:16:32.166 ] 00:16:32.166 }' 00:16:32.166 10:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.166 10:11:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:32.737 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:32.737 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:32.737 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:32.737 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:32.737 10:11:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:32.737 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:32.737 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:32.737 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:32.998 [2024-06-10 10:11:54.716785] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:32.998 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:32.998 "name": "Existed_Raid", 00:16:32.998 "aliases": [ 00:16:32.998 "381a1f90-4488-4296-b706-36033ebb2f2f" 00:16:32.998 ], 00:16:32.998 "product_name": "Raid Volume", 00:16:32.998 "block_size": 512, 00:16:32.998 "num_blocks": 253952, 00:16:32.998 "uuid": "381a1f90-4488-4296-b706-36033ebb2f2f", 00:16:32.998 "assigned_rate_limits": { 00:16:32.998 "rw_ios_per_sec": 0, 00:16:32.998 "rw_mbytes_per_sec": 0, 00:16:32.998 "r_mbytes_per_sec": 0, 00:16:32.998 "w_mbytes_per_sec": 0 00:16:32.998 }, 00:16:32.998 "claimed": false, 00:16:32.998 "zoned": false, 00:16:32.998 "supported_io_types": { 00:16:32.998 "read": true, 00:16:32.998 "write": true, 00:16:32.998 "unmap": true, 00:16:32.998 "write_zeroes": true, 00:16:32.998 "flush": true, 00:16:32.998 "reset": true, 00:16:32.998 "compare": false, 00:16:32.998 "compare_and_write": false, 00:16:32.998 "abort": false, 00:16:32.998 "nvme_admin": false, 00:16:32.998 "nvme_io": false 00:16:32.998 }, 00:16:32.998 "memory_domains": [ 00:16:32.998 { 00:16:32.998 "dma_device_id": "system", 00:16:32.998 "dma_device_type": 1 00:16:32.998 }, 00:16:32.998 { 00:16:32.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.998 "dma_device_type": 2 00:16:32.998 }, 00:16:32.998 { 00:16:32.998 "dma_device_id": "system", 00:16:32.998 "dma_device_type": 1 00:16:32.998 }, 00:16:32.998 { 00:16:32.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.998 "dma_device_type": 2 00:16:32.998 }, 00:16:32.998 { 00:16:32.998 "dma_device_id": "system", 00:16:32.998 "dma_device_type": 1 00:16:32.998 }, 00:16:32.998 { 00:16:32.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.998 "dma_device_type": 2 00:16:32.998 }, 00:16:32.998 { 00:16:32.998 "dma_device_id": "system", 00:16:32.998 "dma_device_type": 1 00:16:32.998 }, 00:16:32.998 { 00:16:32.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.998 "dma_device_type": 2 00:16:32.998 } 00:16:32.998 ], 00:16:32.998 "driver_specific": { 00:16:32.998 "raid": { 00:16:32.998 "uuid": "381a1f90-4488-4296-b706-36033ebb2f2f", 00:16:32.998 "strip_size_kb": 64, 00:16:32.998 "state": "online", 00:16:32.998 "raid_level": "raid0", 00:16:32.998 "superblock": true, 00:16:32.998 "num_base_bdevs": 4, 00:16:32.998 "num_base_bdevs_discovered": 4, 00:16:32.998 "num_base_bdevs_operational": 4, 00:16:32.998 "base_bdevs_list": [ 00:16:32.998 { 00:16:32.998 "name": "BaseBdev1", 00:16:32.998 "uuid": "606d2a07-934a-40ea-b4c6-1851e4d039c7", 00:16:32.998 "is_configured": true, 00:16:32.998 "data_offset": 2048, 00:16:32.998 "data_size": 63488 00:16:32.998 }, 00:16:32.998 { 00:16:32.998 "name": "BaseBdev2", 00:16:32.998 "uuid": "43826a7d-3d6a-45f5-b0e3-69c277f37d3d", 00:16:32.998 "is_configured": true, 00:16:32.998 "data_offset": 2048, 00:16:32.998 "data_size": 63488 00:16:32.998 }, 00:16:32.998 { 00:16:32.998 "name": "BaseBdev3", 
00:16:32.998 "uuid": "02c25991-d86b-412a-8a95-a964a9ed52a6", 00:16:32.998 "is_configured": true, 00:16:32.998 "data_offset": 2048, 00:16:32.998 "data_size": 63488 00:16:32.998 }, 00:16:32.998 { 00:16:32.998 "name": "BaseBdev4", 00:16:32.998 "uuid": "2bbbde77-4d8e-49c9-b579-59bd9b6d758b", 00:16:32.998 "is_configured": true, 00:16:32.998 "data_offset": 2048, 00:16:32.998 "data_size": 63488 00:16:32.998 } 00:16:32.999 ] 00:16:32.999 } 00:16:32.999 } 00:16:32.999 }' 00:16:32.999 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:32.999 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:32.999 BaseBdev2 00:16:32.999 BaseBdev3 00:16:32.999 BaseBdev4' 00:16:32.999 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.999 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:32.999 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:33.259 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:33.259 "name": "BaseBdev1", 00:16:33.259 "aliases": [ 00:16:33.259 "606d2a07-934a-40ea-b4c6-1851e4d039c7" 00:16:33.259 ], 00:16:33.259 "product_name": "Malloc disk", 00:16:33.259 "block_size": 512, 00:16:33.259 "num_blocks": 65536, 00:16:33.259 "uuid": "606d2a07-934a-40ea-b4c6-1851e4d039c7", 00:16:33.259 "assigned_rate_limits": { 00:16:33.259 "rw_ios_per_sec": 0, 00:16:33.259 "rw_mbytes_per_sec": 0, 00:16:33.259 "r_mbytes_per_sec": 0, 00:16:33.259 "w_mbytes_per_sec": 0 00:16:33.259 }, 00:16:33.259 "claimed": true, 00:16:33.259 "claim_type": "exclusive_write", 00:16:33.259 "zoned": false, 00:16:33.259 "supported_io_types": { 00:16:33.259 "read": true, 00:16:33.259 "write": true, 00:16:33.259 "unmap": true, 00:16:33.259 "write_zeroes": true, 00:16:33.259 "flush": true, 00:16:33.259 "reset": true, 00:16:33.259 "compare": false, 00:16:33.259 "compare_and_write": false, 00:16:33.259 "abort": true, 00:16:33.259 "nvme_admin": false, 00:16:33.259 "nvme_io": false 00:16:33.259 }, 00:16:33.259 "memory_domains": [ 00:16:33.259 { 00:16:33.259 "dma_device_id": "system", 00:16:33.259 "dma_device_type": 1 00:16:33.259 }, 00:16:33.259 { 00:16:33.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.259 "dma_device_type": 2 00:16:33.259 } 00:16:33.259 ], 00:16:33.259 "driver_specific": {} 00:16:33.259 }' 00:16:33.259 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.259 10:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.259 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:33.259 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.259 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.259 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:33.259 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.520 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.520 10:11:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:33.520 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.520 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.520 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:33.520 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:33.520 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:33.520 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:33.781 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:33.781 "name": "BaseBdev2", 00:16:33.781 "aliases": [ 00:16:33.781 "43826a7d-3d6a-45f5-b0e3-69c277f37d3d" 00:16:33.781 ], 00:16:33.781 "product_name": "Malloc disk", 00:16:33.781 "block_size": 512, 00:16:33.781 "num_blocks": 65536, 00:16:33.781 "uuid": "43826a7d-3d6a-45f5-b0e3-69c277f37d3d", 00:16:33.781 "assigned_rate_limits": { 00:16:33.781 "rw_ios_per_sec": 0, 00:16:33.781 "rw_mbytes_per_sec": 0, 00:16:33.781 "r_mbytes_per_sec": 0, 00:16:33.781 "w_mbytes_per_sec": 0 00:16:33.781 }, 00:16:33.781 "claimed": true, 00:16:33.781 "claim_type": "exclusive_write", 00:16:33.781 "zoned": false, 00:16:33.781 "supported_io_types": { 00:16:33.781 "read": true, 00:16:33.781 "write": true, 00:16:33.781 "unmap": true, 00:16:33.781 "write_zeroes": true, 00:16:33.781 "flush": true, 00:16:33.781 "reset": true, 00:16:33.781 "compare": false, 00:16:33.781 "compare_and_write": false, 00:16:33.781 "abort": true, 00:16:33.781 "nvme_admin": false, 00:16:33.781 "nvme_io": false 00:16:33.781 }, 00:16:33.781 "memory_domains": [ 00:16:33.781 { 00:16:33.781 "dma_device_id": "system", 00:16:33.781 "dma_device_type": 1 00:16:33.781 }, 00:16:33.781 { 00:16:33.781 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.781 "dma_device_type": 2 00:16:33.781 } 00:16:33.781 ], 00:16:33.781 "driver_specific": {} 00:16:33.781 }' 00:16:33.781 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.781 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.781 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:33.781 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.781 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.781 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:33.781 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.042 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.042 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:34.042 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:34.042 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:34.042 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:34.042 10:11:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:34.042 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:34.042 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:34.303 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:34.303 "name": "BaseBdev3", 00:16:34.303 "aliases": [ 00:16:34.303 "02c25991-d86b-412a-8a95-a964a9ed52a6" 00:16:34.303 ], 00:16:34.303 "product_name": "Malloc disk", 00:16:34.303 "block_size": 512, 00:16:34.303 "num_blocks": 65536, 00:16:34.303 "uuid": "02c25991-d86b-412a-8a95-a964a9ed52a6", 00:16:34.303 "assigned_rate_limits": { 00:16:34.303 "rw_ios_per_sec": 0, 00:16:34.303 "rw_mbytes_per_sec": 0, 00:16:34.303 "r_mbytes_per_sec": 0, 00:16:34.303 "w_mbytes_per_sec": 0 00:16:34.303 }, 00:16:34.303 "claimed": true, 00:16:34.303 "claim_type": "exclusive_write", 00:16:34.303 "zoned": false, 00:16:34.303 "supported_io_types": { 00:16:34.303 "read": true, 00:16:34.303 "write": true, 00:16:34.303 "unmap": true, 00:16:34.303 "write_zeroes": true, 00:16:34.303 "flush": true, 00:16:34.303 "reset": true, 00:16:34.303 "compare": false, 00:16:34.303 "compare_and_write": false, 00:16:34.303 "abort": true, 00:16:34.303 "nvme_admin": false, 00:16:34.303 "nvme_io": false 00:16:34.303 }, 00:16:34.303 "memory_domains": [ 00:16:34.303 { 00:16:34.303 "dma_device_id": "system", 00:16:34.303 "dma_device_type": 1 00:16:34.303 }, 00:16:34.303 { 00:16:34.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.303 "dma_device_type": 2 00:16:34.303 } 00:16:34.303 ], 00:16:34.303 "driver_specific": {} 00:16:34.303 }' 00:16:34.303 10:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:34.303 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:34.303 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:34.303 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.303 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.303 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:34.303 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.303 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.564 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:34.564 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:34.564 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:34.564 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:34.564 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:34.564 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:34.564 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:34.824 10:11:56 
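The repeated jq blocks above are verify_raid_bdev_properties walking every configured base bdev of Existed_Raid and comparing its block_size, md_size, md_interleave and dif_type fields. A compact sketch of the same loop, same assumptions as the earlier sketches; the 512/null expectations mirror the test's [[ ... ]] checks:

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Names of the base bdevs currently configured into the raid volume.
  names=$($RPC bdev_get_bdevs -b Existed_Raid \
          | jq -r '.[].driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')

  for name in $names; do
      info=$($RPC bdev_get_bdevs -b "$name" | jq '.[]')
      [[ $(echo "$info" | jq .block_size)    == 512  ]] || echo "$name: unexpected block_size"
      [[ $(echo "$info" | jq .md_size)       == null ]] || echo "$name: unexpected md_size"
      [[ $(echo "$info" | jq .md_interleave) == null ]] || echo "$name: unexpected md_interleave"
      [[ $(echo "$info" | jq .dif_type)      == null ]] || echo "$name: unexpected dif_type"
  done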
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:34.824 "name": "BaseBdev4", 00:16:34.824 "aliases": [ 00:16:34.824 "2bbbde77-4d8e-49c9-b579-59bd9b6d758b" 00:16:34.824 ], 00:16:34.824 "product_name": "Malloc disk", 00:16:34.824 "block_size": 512, 00:16:34.824 "num_blocks": 65536, 00:16:34.824 "uuid": "2bbbde77-4d8e-49c9-b579-59bd9b6d758b", 00:16:34.824 "assigned_rate_limits": { 00:16:34.824 "rw_ios_per_sec": 0, 00:16:34.824 "rw_mbytes_per_sec": 0, 00:16:34.824 "r_mbytes_per_sec": 0, 00:16:34.824 "w_mbytes_per_sec": 0 00:16:34.824 }, 00:16:34.824 "claimed": true, 00:16:34.824 "claim_type": "exclusive_write", 00:16:34.824 "zoned": false, 00:16:34.824 "supported_io_types": { 00:16:34.824 "read": true, 00:16:34.824 "write": true, 00:16:34.824 "unmap": true, 00:16:34.824 "write_zeroes": true, 00:16:34.824 "flush": true, 00:16:34.824 "reset": true, 00:16:34.824 "compare": false, 00:16:34.824 "compare_and_write": false, 00:16:34.824 "abort": true, 00:16:34.824 "nvme_admin": false, 00:16:34.824 "nvme_io": false 00:16:34.824 }, 00:16:34.824 "memory_domains": [ 00:16:34.824 { 00:16:34.824 "dma_device_id": "system", 00:16:34.824 "dma_device_type": 1 00:16:34.824 }, 00:16:34.824 { 00:16:34.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.824 "dma_device_type": 2 00:16:34.824 } 00:16:34.824 ], 00:16:34.825 "driver_specific": {} 00:16:34.825 }' 00:16:34.825 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:34.825 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:34.825 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:34.825 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.825 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.825 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:34.825 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.825 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.825 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:34.825 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:35.086 [2024-06-10 10:11:56.898098] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:35.086 [2024-06-10 10:11:56.898115] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:35.086 [2024-06-10 10:11:56.898149] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.086 10:11:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:35.347 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.347 "name": "Existed_Raid", 00:16:35.347 "uuid": "381a1f90-4488-4296-b706-36033ebb2f2f", 00:16:35.347 "strip_size_kb": 64, 00:16:35.347 "state": "offline", 00:16:35.347 "raid_level": "raid0", 00:16:35.347 "superblock": true, 00:16:35.347 "num_base_bdevs": 4, 00:16:35.347 "num_base_bdevs_discovered": 3, 00:16:35.347 "num_base_bdevs_operational": 3, 00:16:35.347 "base_bdevs_list": [ 00:16:35.347 { 00:16:35.347 "name": null, 00:16:35.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.347 "is_configured": false, 00:16:35.347 "data_offset": 2048, 00:16:35.347 "data_size": 63488 00:16:35.347 }, 00:16:35.347 { 00:16:35.347 "name": "BaseBdev2", 00:16:35.347 "uuid": "43826a7d-3d6a-45f5-b0e3-69c277f37d3d", 00:16:35.347 "is_configured": true, 00:16:35.347 "data_offset": 2048, 00:16:35.347 "data_size": 63488 00:16:35.347 }, 00:16:35.347 { 00:16:35.347 "name": "BaseBdev3", 00:16:35.347 "uuid": "02c25991-d86b-412a-8a95-a964a9ed52a6", 00:16:35.347 "is_configured": true, 00:16:35.347 "data_offset": 2048, 00:16:35.347 "data_size": 63488 00:16:35.347 }, 00:16:35.347 { 00:16:35.347 "name": "BaseBdev4", 00:16:35.347 "uuid": "2bbbde77-4d8e-49c9-b579-59bd9b6d758b", 00:16:35.347 "is_configured": true, 00:16:35.347 "data_offset": 2048, 00:16:35.347 "data_size": 63488 00:16:35.347 } 00:16:35.347 ] 00:16:35.347 }' 00:16:35.347 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.347 10:11:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:35.919 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:35.919 10:11:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:35.919 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.919 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:36.246 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:36.246 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:36.246 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:36.246 [2024-06-10 10:11:57.968802] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:36.246 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:36.246 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:36.246 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.246 10:11:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:36.508 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:36.508 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:36.508 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:36.509 [2024-06-10 10:11:58.303461] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:36.509 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:36.509 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:36.509 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.509 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:36.769 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:36.769 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:36.769 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:36.769 [2024-06-10 10:11:58.626141] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:36.769 [2024-06-10 10:11:58.626169] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c314c0 name Existed_Raid, state offline 00:16:37.031 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:37.031 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:37.031 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.031 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:37.031 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:37.031 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:37.031 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:37.031 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:37.031 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:37.031 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:37.292 BaseBdev2 00:16:37.292 10:11:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:37.292 10:11:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:16:37.292 10:11:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:37.292 10:11:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:16:37.292 10:11:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:37.292 10:11:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:37.292 10:11:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:37.292 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:37.553 [ 00:16:37.553 { 00:16:37.553 "name": "BaseBdev2", 00:16:37.553 "aliases": [ 00:16:37.553 "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c" 00:16:37.553 ], 00:16:37.553 "product_name": "Malloc disk", 00:16:37.553 "block_size": 512, 00:16:37.553 "num_blocks": 65536, 00:16:37.553 "uuid": "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c", 00:16:37.553 "assigned_rate_limits": { 00:16:37.553 "rw_ios_per_sec": 0, 00:16:37.553 "rw_mbytes_per_sec": 0, 00:16:37.553 "r_mbytes_per_sec": 0, 00:16:37.553 "w_mbytes_per_sec": 0 00:16:37.553 }, 00:16:37.553 "claimed": false, 00:16:37.553 "zoned": false, 00:16:37.553 "supported_io_types": { 00:16:37.553 "read": true, 00:16:37.553 "write": true, 00:16:37.553 "unmap": true, 00:16:37.553 "write_zeroes": true, 00:16:37.553 "flush": true, 00:16:37.553 "reset": true, 00:16:37.553 "compare": false, 00:16:37.553 "compare_and_write": false, 00:16:37.553 "abort": true, 00:16:37.553 "nvme_admin": false, 00:16:37.553 "nvme_io": false 00:16:37.553 }, 00:16:37.553 "memory_domains": [ 00:16:37.553 { 00:16:37.553 "dma_device_id": "system", 00:16:37.553 "dma_device_type": 1 00:16:37.553 }, 00:16:37.553 { 00:16:37.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.553 "dma_device_type": 2 00:16:37.553 } 00:16:37.553 ], 00:16:37.553 "driver_specific": {} 00:16:37.553 } 00:16:37.553 ] 00:16:37.553 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 
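The teardown half of the test, traced above, deletes the array's members one by one: raid0 has no redundancy, so removing a single base bdev takes Existed_Raid from "online" to "offline", and deleting the rest releases the raid bdev itself (the raid_bdev_cleanup messages). A sketch of the equivalent manual sequence, same assumptions as before:

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Drop one member; the raid0 volume leaves the "online" state.
  $RPC bdev_malloc_delete BaseBdev1
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # "offline"

  # Deleting the remaining members lets the raid bdev be cleaned up as well.
  for b in BaseBdev2 BaseBdev3 BaseBdev4; do
      $RPC bdev_malloc_delete "$b"
  done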
00:16:37.553 10:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:37.553 10:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:37.553 10:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:37.815 BaseBdev3 00:16:37.815 10:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:37.815 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:16:37.815 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:37.815 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:16:37.815 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:37.815 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:37.815 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:37.815 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:38.075 [ 00:16:38.075 { 00:16:38.075 "name": "BaseBdev3", 00:16:38.075 "aliases": [ 00:16:38.075 "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310" 00:16:38.075 ], 00:16:38.075 "product_name": "Malloc disk", 00:16:38.075 "block_size": 512, 00:16:38.075 "num_blocks": 65536, 00:16:38.075 "uuid": "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310", 00:16:38.075 "assigned_rate_limits": { 00:16:38.075 "rw_ios_per_sec": 0, 00:16:38.075 "rw_mbytes_per_sec": 0, 00:16:38.075 "r_mbytes_per_sec": 0, 00:16:38.075 "w_mbytes_per_sec": 0 00:16:38.075 }, 00:16:38.075 "claimed": false, 00:16:38.075 "zoned": false, 00:16:38.075 "supported_io_types": { 00:16:38.075 "read": true, 00:16:38.075 "write": true, 00:16:38.075 "unmap": true, 00:16:38.075 "write_zeroes": true, 00:16:38.075 "flush": true, 00:16:38.075 "reset": true, 00:16:38.075 "compare": false, 00:16:38.075 "compare_and_write": false, 00:16:38.075 "abort": true, 00:16:38.075 "nvme_admin": false, 00:16:38.075 "nvme_io": false 00:16:38.075 }, 00:16:38.075 "memory_domains": [ 00:16:38.075 { 00:16:38.075 "dma_device_id": "system", 00:16:38.075 "dma_device_type": 1 00:16:38.075 }, 00:16:38.075 { 00:16:38.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.075 "dma_device_type": 2 00:16:38.075 } 00:16:38.075 ], 00:16:38.075 "driver_specific": {} 00:16:38.075 } 00:16:38.075 ] 00:16:38.075 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:16:38.075 10:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:38.075 10:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:38.075 10:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:38.075 BaseBdev4 00:16:38.076 10:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # 
waitforbdev BaseBdev4 00:16:38.076 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:16:38.076 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:38.076 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:16:38.076 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:38.076 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:38.076 10:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:38.337 10:12:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:38.598 [ 00:16:38.598 { 00:16:38.598 "name": "BaseBdev4", 00:16:38.598 "aliases": [ 00:16:38.598 "f38ebf0b-ccf6-4a1d-8a94-045417092612" 00:16:38.598 ], 00:16:38.598 "product_name": "Malloc disk", 00:16:38.598 "block_size": 512, 00:16:38.598 "num_blocks": 65536, 00:16:38.598 "uuid": "f38ebf0b-ccf6-4a1d-8a94-045417092612", 00:16:38.598 "assigned_rate_limits": { 00:16:38.598 "rw_ios_per_sec": 0, 00:16:38.598 "rw_mbytes_per_sec": 0, 00:16:38.598 "r_mbytes_per_sec": 0, 00:16:38.598 "w_mbytes_per_sec": 0 00:16:38.598 }, 00:16:38.598 "claimed": false, 00:16:38.598 "zoned": false, 00:16:38.598 "supported_io_types": { 00:16:38.598 "read": true, 00:16:38.598 "write": true, 00:16:38.598 "unmap": true, 00:16:38.598 "write_zeroes": true, 00:16:38.598 "flush": true, 00:16:38.598 "reset": true, 00:16:38.598 "compare": false, 00:16:38.598 "compare_and_write": false, 00:16:38.599 "abort": true, 00:16:38.599 "nvme_admin": false, 00:16:38.599 "nvme_io": false 00:16:38.599 }, 00:16:38.599 "memory_domains": [ 00:16:38.599 { 00:16:38.599 "dma_device_id": "system", 00:16:38.599 "dma_device_type": 1 00:16:38.599 }, 00:16:38.599 { 00:16:38.599 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.599 "dma_device_type": 2 00:16:38.599 } 00:16:38.599 ], 00:16:38.599 "driver_specific": {} 00:16:38.599 } 00:16:38.599 ] 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:38.599 [2024-06-10 10:12:00.416688] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:38.599 [2024-06-10 10:12:00.416718] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:38.599 [2024-06-10 10:12:00.416732] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:38.599 [2024-06-10 10:12:00.417769] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:38.599 [2024-06-10 10:12:00.417800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev4 is claimed 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.599 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:38.860 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:38.860 "name": "Existed_Raid", 00:16:38.860 "uuid": "eea59e69-3627-4dee-a184-7180e1935655", 00:16:38.860 "strip_size_kb": 64, 00:16:38.860 "state": "configuring", 00:16:38.860 "raid_level": "raid0", 00:16:38.860 "superblock": true, 00:16:38.860 "num_base_bdevs": 4, 00:16:38.860 "num_base_bdevs_discovered": 3, 00:16:38.860 "num_base_bdevs_operational": 4, 00:16:38.860 "base_bdevs_list": [ 00:16:38.860 { 00:16:38.860 "name": "BaseBdev1", 00:16:38.860 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:38.860 "is_configured": false, 00:16:38.860 "data_offset": 0, 00:16:38.860 "data_size": 0 00:16:38.860 }, 00:16:38.860 { 00:16:38.860 "name": "BaseBdev2", 00:16:38.860 "uuid": "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c", 00:16:38.860 "is_configured": true, 00:16:38.860 "data_offset": 2048, 00:16:38.860 "data_size": 63488 00:16:38.860 }, 00:16:38.860 { 00:16:38.860 "name": "BaseBdev3", 00:16:38.860 "uuid": "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310", 00:16:38.860 "is_configured": true, 00:16:38.860 "data_offset": 2048, 00:16:38.860 "data_size": 63488 00:16:38.860 }, 00:16:38.860 { 00:16:38.860 "name": "BaseBdev4", 00:16:38.860 "uuid": "f38ebf0b-ccf6-4a1d-8a94-045417092612", 00:16:38.860 "is_configured": true, 00:16:38.860 "data_offset": 2048, 00:16:38.860 "data_size": 63488 00:16:38.860 } 00:16:38.860 ] 00:16:38.860 }' 00:16:38.860 10:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:38.860 10:12:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:39.437 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:39.699 [2024-06-10 10:12:01.342994] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:39.699 10:12:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:39.699 "name": "Existed_Raid", 00:16:39.699 "uuid": "eea59e69-3627-4dee-a184-7180e1935655", 00:16:39.699 "strip_size_kb": 64, 00:16:39.699 "state": "configuring", 00:16:39.699 "raid_level": "raid0", 00:16:39.699 "superblock": true, 00:16:39.699 "num_base_bdevs": 4, 00:16:39.699 "num_base_bdevs_discovered": 2, 00:16:39.699 "num_base_bdevs_operational": 4, 00:16:39.699 "base_bdevs_list": [ 00:16:39.699 { 00:16:39.699 "name": "BaseBdev1", 00:16:39.699 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.699 "is_configured": false, 00:16:39.699 "data_offset": 0, 00:16:39.699 "data_size": 0 00:16:39.699 }, 00:16:39.699 { 00:16:39.699 "name": null, 00:16:39.699 "uuid": "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c", 00:16:39.699 "is_configured": false, 00:16:39.699 "data_offset": 2048, 00:16:39.699 "data_size": 63488 00:16:39.699 }, 00:16:39.699 { 00:16:39.699 "name": "BaseBdev3", 00:16:39.699 "uuid": "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310", 00:16:39.699 "is_configured": true, 00:16:39.699 "data_offset": 2048, 00:16:39.699 "data_size": 63488 00:16:39.699 }, 00:16:39.699 { 00:16:39.699 "name": "BaseBdev4", 00:16:39.699 "uuid": "f38ebf0b-ccf6-4a1d-8a94-045417092612", 00:16:39.699 "is_configured": true, 00:16:39.699 "data_offset": 2048, 00:16:39.699 "data_size": 63488 00:16:39.699 } 00:16:39.699 ] 00:16:39.699 }' 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:39.699 10:12:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:40.271 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:40.271 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.531 10:12:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:40.531 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:40.791 [2024-06-10 10:12:02.478859] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:40.791 BaseBdev1 00:16:40.791 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:40.791 10:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:16:40.791 10:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:40.791 10:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:16:40.791 10:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:40.791 10:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:40.791 10:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:40.791 10:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:41.051 [ 00:16:41.051 { 00:16:41.051 "name": "BaseBdev1", 00:16:41.051 "aliases": [ 00:16:41.051 "37a6c5f1-a589-4662-9447-4d57d13e5653" 00:16:41.051 ], 00:16:41.051 "product_name": "Malloc disk", 00:16:41.051 "block_size": 512, 00:16:41.051 "num_blocks": 65536, 00:16:41.051 "uuid": "37a6c5f1-a589-4662-9447-4d57d13e5653", 00:16:41.051 "assigned_rate_limits": { 00:16:41.051 "rw_ios_per_sec": 0, 00:16:41.051 "rw_mbytes_per_sec": 0, 00:16:41.051 "r_mbytes_per_sec": 0, 00:16:41.051 "w_mbytes_per_sec": 0 00:16:41.051 }, 00:16:41.051 "claimed": true, 00:16:41.051 "claim_type": "exclusive_write", 00:16:41.051 "zoned": false, 00:16:41.051 "supported_io_types": { 00:16:41.051 "read": true, 00:16:41.051 "write": true, 00:16:41.051 "unmap": true, 00:16:41.051 "write_zeroes": true, 00:16:41.051 "flush": true, 00:16:41.051 "reset": true, 00:16:41.051 "compare": false, 00:16:41.051 "compare_and_write": false, 00:16:41.051 "abort": true, 00:16:41.051 "nvme_admin": false, 00:16:41.051 "nvme_io": false 00:16:41.051 }, 00:16:41.051 "memory_domains": [ 00:16:41.051 { 00:16:41.051 "dma_device_id": "system", 00:16:41.051 "dma_device_type": 1 00:16:41.051 }, 00:16:41.051 { 00:16:41.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.051 "dma_device_type": 2 00:16:41.051 } 00:16:41.051 ], 00:16:41.051 "driver_specific": {} 00:16:41.051 } 00:16:41.051 ] 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:41.051 10:12:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.051 10:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.312 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.312 "name": "Existed_Raid", 00:16:41.312 "uuid": "eea59e69-3627-4dee-a184-7180e1935655", 00:16:41.312 "strip_size_kb": 64, 00:16:41.312 "state": "configuring", 00:16:41.312 "raid_level": "raid0", 00:16:41.312 "superblock": true, 00:16:41.312 "num_base_bdevs": 4, 00:16:41.312 "num_base_bdevs_discovered": 3, 00:16:41.312 "num_base_bdevs_operational": 4, 00:16:41.312 "base_bdevs_list": [ 00:16:41.312 { 00:16:41.312 "name": "BaseBdev1", 00:16:41.312 "uuid": "37a6c5f1-a589-4662-9447-4d57d13e5653", 00:16:41.312 "is_configured": true, 00:16:41.312 "data_offset": 2048, 00:16:41.312 "data_size": 63488 00:16:41.312 }, 00:16:41.312 { 00:16:41.312 "name": null, 00:16:41.312 "uuid": "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c", 00:16:41.312 "is_configured": false, 00:16:41.312 "data_offset": 2048, 00:16:41.312 "data_size": 63488 00:16:41.312 }, 00:16:41.312 { 00:16:41.312 "name": "BaseBdev3", 00:16:41.312 "uuid": "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310", 00:16:41.312 "is_configured": true, 00:16:41.312 "data_offset": 2048, 00:16:41.312 "data_size": 63488 00:16:41.312 }, 00:16:41.312 { 00:16:41.312 "name": "BaseBdev4", 00:16:41.312 "uuid": "f38ebf0b-ccf6-4a1d-8a94-045417092612", 00:16:41.312 "is_configured": true, 00:16:41.312 "data_offset": 2048, 00:16:41.312 "data_size": 63488 00:16:41.312 } 00:16:41.312 ] 00:16:41.312 }' 00:16:41.312 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.312 10:12:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:41.884 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.884 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:41.884 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:41.884 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:42.145 [2024-06-10 10:12:03.902481] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 4 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.145 10:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:42.406 10:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.406 "name": "Existed_Raid", 00:16:42.406 "uuid": "eea59e69-3627-4dee-a184-7180e1935655", 00:16:42.406 "strip_size_kb": 64, 00:16:42.406 "state": "configuring", 00:16:42.406 "raid_level": "raid0", 00:16:42.406 "superblock": true, 00:16:42.406 "num_base_bdevs": 4, 00:16:42.406 "num_base_bdevs_discovered": 2, 00:16:42.406 "num_base_bdevs_operational": 4, 00:16:42.406 "base_bdevs_list": [ 00:16:42.406 { 00:16:42.406 "name": "BaseBdev1", 00:16:42.406 "uuid": "37a6c5f1-a589-4662-9447-4d57d13e5653", 00:16:42.406 "is_configured": true, 00:16:42.406 "data_offset": 2048, 00:16:42.406 "data_size": 63488 00:16:42.406 }, 00:16:42.406 { 00:16:42.406 "name": null, 00:16:42.406 "uuid": "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c", 00:16:42.406 "is_configured": false, 00:16:42.406 "data_offset": 2048, 00:16:42.406 "data_size": 63488 00:16:42.406 }, 00:16:42.406 { 00:16:42.406 "name": null, 00:16:42.406 "uuid": "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310", 00:16:42.406 "is_configured": false, 00:16:42.406 "data_offset": 2048, 00:16:42.406 "data_size": 63488 00:16:42.406 }, 00:16:42.406 { 00:16:42.406 "name": "BaseBdev4", 00:16:42.406 "uuid": "f38ebf0b-ccf6-4a1d-8a94-045417092612", 00:16:42.406 "is_configured": true, 00:16:42.406 "data_offset": 2048, 00:16:42.406 "data_size": 63488 00:16:42.406 } 00:16:42.406 ] 00:16:42.406 }' 00:16:42.406 10:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.406 10:12:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:42.978 10:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.978 10:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:42.978 10:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:42.978 10:12:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:43.239 [2024-06-10 10:12:05.005283] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.239 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.501 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.501 "name": "Existed_Raid", 00:16:43.501 "uuid": "eea59e69-3627-4dee-a184-7180e1935655", 00:16:43.501 "strip_size_kb": 64, 00:16:43.501 "state": "configuring", 00:16:43.501 "raid_level": "raid0", 00:16:43.501 "superblock": true, 00:16:43.501 "num_base_bdevs": 4, 00:16:43.501 "num_base_bdevs_discovered": 3, 00:16:43.501 "num_base_bdevs_operational": 4, 00:16:43.501 "base_bdevs_list": [ 00:16:43.501 { 00:16:43.501 "name": "BaseBdev1", 00:16:43.501 "uuid": "37a6c5f1-a589-4662-9447-4d57d13e5653", 00:16:43.501 "is_configured": true, 00:16:43.501 "data_offset": 2048, 00:16:43.501 "data_size": 63488 00:16:43.501 }, 00:16:43.501 { 00:16:43.501 "name": null, 00:16:43.501 "uuid": "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c", 00:16:43.501 "is_configured": false, 00:16:43.501 "data_offset": 2048, 00:16:43.501 "data_size": 63488 00:16:43.501 }, 00:16:43.501 { 00:16:43.501 "name": "BaseBdev3", 00:16:43.501 "uuid": "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310", 00:16:43.501 "is_configured": true, 00:16:43.501 "data_offset": 2048, 00:16:43.501 "data_size": 63488 00:16:43.501 }, 00:16:43.501 { 00:16:43.501 "name": "BaseBdev4", 00:16:43.501 "uuid": "f38ebf0b-ccf6-4a1d-8a94-045417092612", 00:16:43.501 "is_configured": true, 00:16:43.501 "data_offset": 2048, 00:16:43.501 "data_size": 63488 00:16:43.501 } 00:16:43.501 ] 00:16:43.501 }' 00:16:43.501 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.501 10:12:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:44.073 10:12:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:44.073 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.334 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:44.334 10:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:44.334 [2024-06-10 10:12:06.120117] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.334 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.594 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.594 "name": "Existed_Raid", 00:16:44.594 "uuid": "eea59e69-3627-4dee-a184-7180e1935655", 00:16:44.594 "strip_size_kb": 64, 00:16:44.594 "state": "configuring", 00:16:44.594 "raid_level": "raid0", 00:16:44.594 "superblock": true, 00:16:44.594 "num_base_bdevs": 4, 00:16:44.594 "num_base_bdevs_discovered": 2, 00:16:44.594 "num_base_bdevs_operational": 4, 00:16:44.594 "base_bdevs_list": [ 00:16:44.594 { 00:16:44.594 "name": null, 00:16:44.594 "uuid": "37a6c5f1-a589-4662-9447-4d57d13e5653", 00:16:44.594 "is_configured": false, 00:16:44.594 "data_offset": 2048, 00:16:44.594 "data_size": 63488 00:16:44.594 }, 00:16:44.594 { 00:16:44.594 "name": null, 00:16:44.594 "uuid": "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c", 00:16:44.594 "is_configured": false, 00:16:44.594 "data_offset": 2048, 00:16:44.594 "data_size": 63488 00:16:44.594 }, 00:16:44.594 { 00:16:44.594 "name": "BaseBdev3", 00:16:44.594 "uuid": "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310", 00:16:44.594 "is_configured": true, 00:16:44.594 "data_offset": 2048, 00:16:44.594 "data_size": 63488 00:16:44.594 }, 00:16:44.594 { 00:16:44.594 "name": "BaseBdev4", 00:16:44.594 "uuid": 
"f38ebf0b-ccf6-4a1d-8a94-045417092612", 00:16:44.594 "is_configured": true, 00:16:44.594 "data_offset": 2048, 00:16:44.594 "data_size": 63488 00:16:44.594 } 00:16:44.594 ] 00:16:44.594 }' 00:16:44.594 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.594 10:12:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:45.166 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.166 10:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:45.428 [2024-06-10 10:12:07.244768] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.428 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.690 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.690 "name": "Existed_Raid", 00:16:45.690 "uuid": "eea59e69-3627-4dee-a184-7180e1935655", 00:16:45.690 "strip_size_kb": 64, 00:16:45.690 "state": "configuring", 00:16:45.690 "raid_level": "raid0", 00:16:45.690 "superblock": true, 00:16:45.690 "num_base_bdevs": 4, 00:16:45.690 "num_base_bdevs_discovered": 3, 00:16:45.690 "num_base_bdevs_operational": 4, 00:16:45.690 "base_bdevs_list": [ 00:16:45.690 { 00:16:45.690 "name": null, 00:16:45.690 "uuid": "37a6c5f1-a589-4662-9447-4d57d13e5653", 00:16:45.690 "is_configured": false, 00:16:45.690 "data_offset": 2048, 00:16:45.690 "data_size": 63488 00:16:45.690 }, 00:16:45.690 { 00:16:45.690 "name": "BaseBdev2", 00:16:45.690 "uuid": 
"9f9a67f4-4af6-4d75-b4f0-19471e27bc7c", 00:16:45.690 "is_configured": true, 00:16:45.690 "data_offset": 2048, 00:16:45.690 "data_size": 63488 00:16:45.690 }, 00:16:45.690 { 00:16:45.690 "name": "BaseBdev3", 00:16:45.690 "uuid": "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310", 00:16:45.690 "is_configured": true, 00:16:45.690 "data_offset": 2048, 00:16:45.690 "data_size": 63488 00:16:45.690 }, 00:16:45.690 { 00:16:45.690 "name": "BaseBdev4", 00:16:45.690 "uuid": "f38ebf0b-ccf6-4a1d-8a94-045417092612", 00:16:45.690 "is_configured": true, 00:16:45.690 "data_offset": 2048, 00:16:45.690 "data_size": 63488 00:16:45.690 } 00:16:45.690 ] 00:16:45.690 }' 00:16:45.690 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.690 10:12:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:46.261 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.261 10:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:46.522 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:46.522 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.522 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:46.522 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 37a6c5f1-a589-4662-9447-4d57d13e5653 00:16:46.782 [2024-06-10 10:12:08.553047] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:46.782 [2024-06-10 10:12:08.553168] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c32a00 00:16:46.783 [2024-06-10 10:12:08.553177] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:46.783 [2024-06-10 10:12:08.553315] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x192e240 00:16:46.783 [2024-06-10 10:12:08.553401] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c32a00 00:16:46.783 [2024-06-10 10:12:08.553407] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c32a00 00:16:46.783 [2024-06-10 10:12:08.553473] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:46.783 NewBaseBdev 00:16:46.783 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:46.783 10:12:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:16:46.783 10:12:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:46.783 10:12:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:16:46.783 10:12:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:46.783 10:12:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:46.783 10:12:08 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:47.043 10:12:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:47.303 [ 00:16:47.303 { 00:16:47.303 "name": "NewBaseBdev", 00:16:47.303 "aliases": [ 00:16:47.303 "37a6c5f1-a589-4662-9447-4d57d13e5653" 00:16:47.303 ], 00:16:47.303 "product_name": "Malloc disk", 00:16:47.303 "block_size": 512, 00:16:47.303 "num_blocks": 65536, 00:16:47.303 "uuid": "37a6c5f1-a589-4662-9447-4d57d13e5653", 00:16:47.303 "assigned_rate_limits": { 00:16:47.303 "rw_ios_per_sec": 0, 00:16:47.303 "rw_mbytes_per_sec": 0, 00:16:47.303 "r_mbytes_per_sec": 0, 00:16:47.303 "w_mbytes_per_sec": 0 00:16:47.304 }, 00:16:47.304 "claimed": true, 00:16:47.304 "claim_type": "exclusive_write", 00:16:47.304 "zoned": false, 00:16:47.304 "supported_io_types": { 00:16:47.304 "read": true, 00:16:47.304 "write": true, 00:16:47.304 "unmap": true, 00:16:47.304 "write_zeroes": true, 00:16:47.304 "flush": true, 00:16:47.304 "reset": true, 00:16:47.304 "compare": false, 00:16:47.304 "compare_and_write": false, 00:16:47.304 "abort": true, 00:16:47.304 "nvme_admin": false, 00:16:47.304 "nvme_io": false 00:16:47.304 }, 00:16:47.304 "memory_domains": [ 00:16:47.304 { 00:16:47.304 "dma_device_id": "system", 00:16:47.304 "dma_device_type": 1 00:16:47.304 }, 00:16:47.304 { 00:16:47.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.304 "dma_device_type": 2 00:16:47.304 } 00:16:47.304 ], 00:16:47.304 "driver_specific": {} 00:16:47.304 } 00:16:47.304 ] 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.304 10:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.304 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.304 "name": "Existed_Raid", 
00:16:47.304 "uuid": "eea59e69-3627-4dee-a184-7180e1935655", 00:16:47.304 "strip_size_kb": 64, 00:16:47.304 "state": "online", 00:16:47.304 "raid_level": "raid0", 00:16:47.304 "superblock": true, 00:16:47.304 "num_base_bdevs": 4, 00:16:47.304 "num_base_bdevs_discovered": 4, 00:16:47.304 "num_base_bdevs_operational": 4, 00:16:47.304 "base_bdevs_list": [ 00:16:47.304 { 00:16:47.304 "name": "NewBaseBdev", 00:16:47.304 "uuid": "37a6c5f1-a589-4662-9447-4d57d13e5653", 00:16:47.304 "is_configured": true, 00:16:47.304 "data_offset": 2048, 00:16:47.304 "data_size": 63488 00:16:47.304 }, 00:16:47.304 { 00:16:47.304 "name": "BaseBdev2", 00:16:47.304 "uuid": "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c", 00:16:47.304 "is_configured": true, 00:16:47.304 "data_offset": 2048, 00:16:47.304 "data_size": 63488 00:16:47.304 }, 00:16:47.304 { 00:16:47.304 "name": "BaseBdev3", 00:16:47.304 "uuid": "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310", 00:16:47.304 "is_configured": true, 00:16:47.304 "data_offset": 2048, 00:16:47.304 "data_size": 63488 00:16:47.304 }, 00:16:47.304 { 00:16:47.304 "name": "BaseBdev4", 00:16:47.304 "uuid": "f38ebf0b-ccf6-4a1d-8a94-045417092612", 00:16:47.304 "is_configured": true, 00:16:47.304 "data_offset": 2048, 00:16:47.304 "data_size": 63488 00:16:47.304 } 00:16:47.304 ] 00:16:47.304 }' 00:16:47.304 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.304 10:12:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:47.875 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:47.875 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:47.875 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:47.875 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:47.875 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:47.875 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:47.875 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:47.875 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:48.136 [2024-06-10 10:12:09.820480] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:48.136 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:48.136 "name": "Existed_Raid", 00:16:48.136 "aliases": [ 00:16:48.136 "eea59e69-3627-4dee-a184-7180e1935655" 00:16:48.136 ], 00:16:48.136 "product_name": "Raid Volume", 00:16:48.136 "block_size": 512, 00:16:48.136 "num_blocks": 253952, 00:16:48.136 "uuid": "eea59e69-3627-4dee-a184-7180e1935655", 00:16:48.136 "assigned_rate_limits": { 00:16:48.136 "rw_ios_per_sec": 0, 00:16:48.136 "rw_mbytes_per_sec": 0, 00:16:48.136 "r_mbytes_per_sec": 0, 00:16:48.136 "w_mbytes_per_sec": 0 00:16:48.136 }, 00:16:48.136 "claimed": false, 00:16:48.136 "zoned": false, 00:16:48.136 "supported_io_types": { 00:16:48.136 "read": true, 00:16:48.136 "write": true, 00:16:48.136 "unmap": true, 00:16:48.136 "write_zeroes": true, 00:16:48.136 "flush": true, 00:16:48.136 "reset": true, 00:16:48.136 "compare": false, 00:16:48.136 
"compare_and_write": false, 00:16:48.136 "abort": false, 00:16:48.136 "nvme_admin": false, 00:16:48.136 "nvme_io": false 00:16:48.136 }, 00:16:48.136 "memory_domains": [ 00:16:48.136 { 00:16:48.136 "dma_device_id": "system", 00:16:48.136 "dma_device_type": 1 00:16:48.136 }, 00:16:48.136 { 00:16:48.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.136 "dma_device_type": 2 00:16:48.136 }, 00:16:48.136 { 00:16:48.136 "dma_device_id": "system", 00:16:48.136 "dma_device_type": 1 00:16:48.136 }, 00:16:48.136 { 00:16:48.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.136 "dma_device_type": 2 00:16:48.136 }, 00:16:48.136 { 00:16:48.136 "dma_device_id": "system", 00:16:48.136 "dma_device_type": 1 00:16:48.136 }, 00:16:48.136 { 00:16:48.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.136 "dma_device_type": 2 00:16:48.136 }, 00:16:48.136 { 00:16:48.136 "dma_device_id": "system", 00:16:48.136 "dma_device_type": 1 00:16:48.136 }, 00:16:48.136 { 00:16:48.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.136 "dma_device_type": 2 00:16:48.136 } 00:16:48.136 ], 00:16:48.136 "driver_specific": { 00:16:48.136 "raid": { 00:16:48.136 "uuid": "eea59e69-3627-4dee-a184-7180e1935655", 00:16:48.136 "strip_size_kb": 64, 00:16:48.136 "state": "online", 00:16:48.136 "raid_level": "raid0", 00:16:48.136 "superblock": true, 00:16:48.136 "num_base_bdevs": 4, 00:16:48.136 "num_base_bdevs_discovered": 4, 00:16:48.136 "num_base_bdevs_operational": 4, 00:16:48.136 "base_bdevs_list": [ 00:16:48.136 { 00:16:48.136 "name": "NewBaseBdev", 00:16:48.136 "uuid": "37a6c5f1-a589-4662-9447-4d57d13e5653", 00:16:48.136 "is_configured": true, 00:16:48.136 "data_offset": 2048, 00:16:48.136 "data_size": 63488 00:16:48.136 }, 00:16:48.136 { 00:16:48.136 "name": "BaseBdev2", 00:16:48.136 "uuid": "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c", 00:16:48.136 "is_configured": true, 00:16:48.136 "data_offset": 2048, 00:16:48.136 "data_size": 63488 00:16:48.136 }, 00:16:48.136 { 00:16:48.136 "name": "BaseBdev3", 00:16:48.136 "uuid": "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310", 00:16:48.136 "is_configured": true, 00:16:48.136 "data_offset": 2048, 00:16:48.136 "data_size": 63488 00:16:48.136 }, 00:16:48.136 { 00:16:48.136 "name": "BaseBdev4", 00:16:48.136 "uuid": "f38ebf0b-ccf6-4a1d-8a94-045417092612", 00:16:48.136 "is_configured": true, 00:16:48.136 "data_offset": 2048, 00:16:48.136 "data_size": 63488 00:16:48.136 } 00:16:48.136 ] 00:16:48.136 } 00:16:48.136 } 00:16:48.136 }' 00:16:48.136 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:48.136 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:48.136 BaseBdev2 00:16:48.136 BaseBdev3 00:16:48.136 BaseBdev4' 00:16:48.136 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:48.136 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:48.136 10:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:48.397 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:48.397 "name": "NewBaseBdev", 00:16:48.397 "aliases": [ 00:16:48.397 "37a6c5f1-a589-4662-9447-4d57d13e5653" 00:16:48.397 ], 00:16:48.397 "product_name": "Malloc disk", 
00:16:48.397 "block_size": 512, 00:16:48.397 "num_blocks": 65536, 00:16:48.397 "uuid": "37a6c5f1-a589-4662-9447-4d57d13e5653", 00:16:48.397 "assigned_rate_limits": { 00:16:48.397 "rw_ios_per_sec": 0, 00:16:48.397 "rw_mbytes_per_sec": 0, 00:16:48.397 "r_mbytes_per_sec": 0, 00:16:48.397 "w_mbytes_per_sec": 0 00:16:48.397 }, 00:16:48.397 "claimed": true, 00:16:48.397 "claim_type": "exclusive_write", 00:16:48.397 "zoned": false, 00:16:48.397 "supported_io_types": { 00:16:48.397 "read": true, 00:16:48.397 "write": true, 00:16:48.397 "unmap": true, 00:16:48.397 "write_zeroes": true, 00:16:48.397 "flush": true, 00:16:48.397 "reset": true, 00:16:48.397 "compare": false, 00:16:48.397 "compare_and_write": false, 00:16:48.397 "abort": true, 00:16:48.397 "nvme_admin": false, 00:16:48.397 "nvme_io": false 00:16:48.397 }, 00:16:48.397 "memory_domains": [ 00:16:48.397 { 00:16:48.397 "dma_device_id": "system", 00:16:48.397 "dma_device_type": 1 00:16:48.397 }, 00:16:48.397 { 00:16:48.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.397 "dma_device_type": 2 00:16:48.397 } 00:16:48.397 ], 00:16:48.397 "driver_specific": {} 00:16:48.397 }' 00:16:48.397 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.397 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.397 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:48.397 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:48.397 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:48.397 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:48.397 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:48.658 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:48.658 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:48.658 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:48.658 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:48.658 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:48.658 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:48.658 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:48.658 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:48.941 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:48.941 "name": "BaseBdev2", 00:16:48.941 "aliases": [ 00:16:48.941 "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c" 00:16:48.941 ], 00:16:48.941 "product_name": "Malloc disk", 00:16:48.941 "block_size": 512, 00:16:48.941 "num_blocks": 65536, 00:16:48.941 "uuid": "9f9a67f4-4af6-4d75-b4f0-19471e27bc7c", 00:16:48.941 "assigned_rate_limits": { 00:16:48.941 "rw_ios_per_sec": 0, 00:16:48.941 "rw_mbytes_per_sec": 0, 00:16:48.941 "r_mbytes_per_sec": 0, 00:16:48.941 "w_mbytes_per_sec": 0 00:16:48.941 }, 00:16:48.941 "claimed": true, 00:16:48.941 "claim_type": "exclusive_write", 00:16:48.941 "zoned": false, 
00:16:48.941 "supported_io_types": { 00:16:48.941 "read": true, 00:16:48.941 "write": true, 00:16:48.941 "unmap": true, 00:16:48.941 "write_zeroes": true, 00:16:48.941 "flush": true, 00:16:48.941 "reset": true, 00:16:48.941 "compare": false, 00:16:48.941 "compare_and_write": false, 00:16:48.941 "abort": true, 00:16:48.941 "nvme_admin": false, 00:16:48.941 "nvme_io": false 00:16:48.941 }, 00:16:48.941 "memory_domains": [ 00:16:48.941 { 00:16:48.941 "dma_device_id": "system", 00:16:48.941 "dma_device_type": 1 00:16:48.941 }, 00:16:48.941 { 00:16:48.941 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.941 "dma_device_type": 2 00:16:48.941 } 00:16:48.941 ], 00:16:48.941 "driver_specific": {} 00:16:48.941 }' 00:16:48.941 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.941 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.941 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:48.941 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:48.941 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:48.941 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:48.941 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.202 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.202 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:49.202 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.202 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.202 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:49.202 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:49.202 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:49.202 10:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:49.483 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:49.483 "name": "BaseBdev3", 00:16:49.483 "aliases": [ 00:16:49.483 "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310" 00:16:49.483 ], 00:16:49.483 "product_name": "Malloc disk", 00:16:49.483 "block_size": 512, 00:16:49.483 "num_blocks": 65536, 00:16:49.483 "uuid": "dc76b5cc-6bdc-49d4-9e3c-0b6d845b6310", 00:16:49.483 "assigned_rate_limits": { 00:16:49.483 "rw_ios_per_sec": 0, 00:16:49.483 "rw_mbytes_per_sec": 0, 00:16:49.483 "r_mbytes_per_sec": 0, 00:16:49.483 "w_mbytes_per_sec": 0 00:16:49.483 }, 00:16:49.483 "claimed": true, 00:16:49.483 "claim_type": "exclusive_write", 00:16:49.483 "zoned": false, 00:16:49.483 "supported_io_types": { 00:16:49.483 "read": true, 00:16:49.483 "write": true, 00:16:49.483 "unmap": true, 00:16:49.483 "write_zeroes": true, 00:16:49.483 "flush": true, 00:16:49.483 "reset": true, 00:16:49.483 "compare": false, 00:16:49.483 "compare_and_write": false, 00:16:49.483 "abort": true, 00:16:49.483 "nvme_admin": false, 00:16:49.483 "nvme_io": false 00:16:49.483 }, 00:16:49.483 "memory_domains": [ 
00:16:49.483 { 00:16:49.483 "dma_device_id": "system", 00:16:49.483 "dma_device_type": 1 00:16:49.483 }, 00:16:49.483 { 00:16:49.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.483 "dma_device_type": 2 00:16:49.483 } 00:16:49.483 ], 00:16:49.483 "driver_specific": {} 00:16:49.483 }' 00:16:49.483 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:49.483 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:49.483 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:49.483 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.483 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.747 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:49.747 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.747 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.747 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:49.747 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.747 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.747 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:49.747 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:49.747 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:49.747 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:50.007 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:50.007 "name": "BaseBdev4", 00:16:50.007 "aliases": [ 00:16:50.007 "f38ebf0b-ccf6-4a1d-8a94-045417092612" 00:16:50.007 ], 00:16:50.007 "product_name": "Malloc disk", 00:16:50.007 "block_size": 512, 00:16:50.007 "num_blocks": 65536, 00:16:50.007 "uuid": "f38ebf0b-ccf6-4a1d-8a94-045417092612", 00:16:50.007 "assigned_rate_limits": { 00:16:50.007 "rw_ios_per_sec": 0, 00:16:50.007 "rw_mbytes_per_sec": 0, 00:16:50.007 "r_mbytes_per_sec": 0, 00:16:50.007 "w_mbytes_per_sec": 0 00:16:50.007 }, 00:16:50.007 "claimed": true, 00:16:50.007 "claim_type": "exclusive_write", 00:16:50.007 "zoned": false, 00:16:50.007 "supported_io_types": { 00:16:50.007 "read": true, 00:16:50.007 "write": true, 00:16:50.007 "unmap": true, 00:16:50.007 "write_zeroes": true, 00:16:50.007 "flush": true, 00:16:50.007 "reset": true, 00:16:50.007 "compare": false, 00:16:50.007 "compare_and_write": false, 00:16:50.007 "abort": true, 00:16:50.007 "nvme_admin": false, 00:16:50.007 "nvme_io": false 00:16:50.007 }, 00:16:50.007 "memory_domains": [ 00:16:50.007 { 00:16:50.007 "dma_device_id": "system", 00:16:50.007 "dma_device_type": 1 00:16:50.007 }, 00:16:50.007 { 00:16:50.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.007 "dma_device_type": 2 00:16:50.007 } 00:16:50.007 ], 00:16:50.007 "driver_specific": {} 00:16:50.007 }' 00:16:50.007 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.007 10:12:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.007 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:50.007 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:50.007 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:50.267 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:50.267 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.267 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.267 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:50.267 10:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:50.267 10:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:50.267 10:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:50.267 10:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:50.526 [2024-06-10 10:12:12.222361] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:50.526 [2024-06-10 10:12:12.222385] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:50.526 [2024-06-10 10:12:12.222431] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:50.526 [2024-06-10 10:12:12.222483] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:50.526 [2024-06-10 10:12:12.222489] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c32a00 name Existed_Raid, state offline 00:16:50.526 10:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1026437 00:16:50.526 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1026437 ']' 00:16:50.526 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1026437 00:16:50.526 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:16:50.526 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:16:50.526 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1026437 00:16:50.526 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:16:50.526 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:16:50.526 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1026437' 00:16:50.526 killing process with pid 1026437 00:16:50.526 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1026437 00:16:50.526 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1026437 00:16:50.526 [2024-06-10 10:12:12.288910] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:50.526 [2024-06-10 10:12:12.316988] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:50.786 10:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:50.786 00:16:50.786 real 0m26.219s 00:16:50.786 user 0m49.161s 00:16:50.786 sys 0m3.859s 00:16:50.786 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:16:50.786 10:12:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:50.786 ************************************ 00:16:50.786 END TEST raid_state_function_test_sb 00:16:50.786 ************************************ 00:16:50.786 10:12:12 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:16:50.786 10:12:12 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:16:50.786 10:12:12 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:16:50.786 10:12:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:50.786 ************************************ 00:16:50.786 START TEST raid_superblock_test 00:16:50.786 ************************************ 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 4 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1032112 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1032112 /var/tmp/spdk-raid.sock 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1032112 ']' 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:50.786 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:16:50.786 10:12:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.786 [2024-06-10 10:12:12.594801] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:16:50.786 [2024-06-10 10:12:12.594867] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1032112 ] 00:16:51.046 [2024-06-10 10:12:12.685403] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:51.046 [2024-06-10 10:12:12.755247] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:16:51.046 [2024-06-10 10:12:12.796646] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:51.046 [2024-06-10 10:12:12.796669] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:51.616 10:12:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:16:51.616 10:12:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:16:51.616 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:51.616 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:51.616 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:51.616 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:51.616 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:51.616 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:51.616 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:51.616 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:51.616 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:51.876 malloc1 00:16:51.876 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:52.137 [2024-06-10 10:12:13.794711] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:52.137 [2024-06-10 10:12:13.794744] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:52.137 [2024-06-10 10:12:13.794755] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23b5990 00:16:52.137 [2024-06-10 10:12:13.794761] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:52.137 [2024-06-10 10:12:13.796069] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:52.137 [2024-06-10 10:12:13.796088] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:52.137 pt1 00:16:52.137 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:52.137 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:52.137 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:52.137 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:52.137 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:52.137 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:52.137 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:52.137 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:52.137 10:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:52.137 malloc2 00:16:52.397 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:52.397 [2024-06-10 10:12:14.177724] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:52.397 [2024-06-10 10:12:14.177755] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:52.397 [2024-06-10 10:12:14.177766] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23b64e0 00:16:52.397 [2024-06-10 10:12:14.177772] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:52.397 [2024-06-10 10:12:14.178977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:52.397 [2024-06-10 10:12:14.178995] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:52.397 pt2 00:16:52.397 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:52.397 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:52.397 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:52.397 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:52.397 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:52.397 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:52.397 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:52.397 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:52.397 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:52.657 malloc3 
00:16:52.658 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:52.917 [2024-06-10 10:12:14.564663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:52.918 [2024-06-10 10:12:14.564692] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:52.918 [2024-06-10 10:12:14.564701] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25624e0 00:16:52.918 [2024-06-10 10:12:14.564708] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:52.918 [2024-06-10 10:12:14.565894] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:52.918 [2024-06-10 10:12:14.565912] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:52.918 pt3 00:16:52.918 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:52.918 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:52.918 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:16:52.918 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:16:52.918 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:16:52.918 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:52.918 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:52.918 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:52.918 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:16:52.918 malloc4 00:16:52.918 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:53.179 [2024-06-10 10:12:14.943479] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:53.179 [2024-06-10 10:12:14.943507] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:53.179 [2024-06-10 10:12:14.943516] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2564f50 00:16:53.179 [2024-06-10 10:12:14.943522] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:53.179 [2024-06-10 10:12:14.944697] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:53.179 [2024-06-10 10:12:14.944715] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:53.179 pt4 00:16:53.179 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:53.179 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:53.179 10:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 
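The setup traced above reduces to the RPC sequence sketched below. This is a minimal consolidation assembled from the commands echoed in this log, not an excerpt of bdev_raid.sh itself: the rpc() helper function and the for-loop are introduced here only for brevity, while the rpc.py path, socket, UUIDs and flags (-z 64 strip size in KiB, -r raid0, -s to write a superblock) are copied verbatim from the trace.

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

# Create four malloc base bdevs (32 MiB, 512-byte blocks, matching num_blocks
# 65536 in the dumps below) and wrap each one in a passthru bdev pt1..pt4 with
# a fixed UUID, as the trace above does one bdev at a time.
for i in 1 2 3 4; do
    rpc bdev_malloc_create 32 512 -b "malloc$i"
    rpc bdev_passthru_create -b "malloc$i" -p "pt$i" -u "00000000-0000-0000-0000-00000000000$i"
done

# Assemble the passthru bdevs into a raid0 volume with a 64 KiB strip size and
# an on-disk superblock, then inspect it (the JSON that follows in the trace).
rpc bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

# Teardown performed later in the trace: delete the raid volume, then the
# passthru bdevs it was built from.
rpc bdev_raid_delete raid_bdev1
for i in 1 2 3 4; do rpc bdev_passthru_delete "pt$i"; done
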
00:16:53.439 [2024-06-10 10:12:15.135984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:53.439 [2024-06-10 10:12:15.136995] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:53.439 [2024-06-10 10:12:15.137037] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:53.439 [2024-06-10 10:12:15.137071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:53.439 [2024-06-10 10:12:15.137204] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2566fb0 00:16:53.439 [2024-06-10 10:12:15.137211] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:53.439 [2024-06-10 10:12:15.137363] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23b6df0 00:16:53.439 [2024-06-10 10:12:15.137472] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2566fb0 00:16:53.439 [2024-06-10 10:12:15.137477] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2566fb0 00:16:53.439 [2024-06-10 10:12:15.137545] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:53.439 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:53.439 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:53.439 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:53.439 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:53.440 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:53.440 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:53.440 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:53.440 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:53.440 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:53.440 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:53.440 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:53.440 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.699 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:53.699 "name": "raid_bdev1", 00:16:53.699 "uuid": "59bd94f9-3457-4098-b50c-52787b56c90d", 00:16:53.699 "strip_size_kb": 64, 00:16:53.699 "state": "online", 00:16:53.699 "raid_level": "raid0", 00:16:53.699 "superblock": true, 00:16:53.700 "num_base_bdevs": 4, 00:16:53.700 "num_base_bdevs_discovered": 4, 00:16:53.700 "num_base_bdevs_operational": 4, 00:16:53.700 "base_bdevs_list": [ 00:16:53.700 { 00:16:53.700 "name": "pt1", 00:16:53.700 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:53.700 "is_configured": true, 00:16:53.700 "data_offset": 2048, 00:16:53.700 "data_size": 63488 00:16:53.700 }, 00:16:53.700 { 00:16:53.700 "name": "pt2", 00:16:53.700 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:53.700 "is_configured": true, 00:16:53.700 "data_offset": 2048, 
00:16:53.700 "data_size": 63488 00:16:53.700 }, 00:16:53.700 { 00:16:53.700 "name": "pt3", 00:16:53.700 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:53.700 "is_configured": true, 00:16:53.700 "data_offset": 2048, 00:16:53.700 "data_size": 63488 00:16:53.700 }, 00:16:53.700 { 00:16:53.700 "name": "pt4", 00:16:53.700 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:53.700 "is_configured": true, 00:16:53.700 "data_offset": 2048, 00:16:53.700 "data_size": 63488 00:16:53.700 } 00:16:53.700 ] 00:16:53.700 }' 00:16:53.700 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:53.700 10:12:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:54.270 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:54.270 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:54.270 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:54.270 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:54.270 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:54.270 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:54.270 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:54.270 10:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:54.270 [2024-06-10 10:12:16.074570] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:54.270 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:54.270 "name": "raid_bdev1", 00:16:54.270 "aliases": [ 00:16:54.270 "59bd94f9-3457-4098-b50c-52787b56c90d" 00:16:54.270 ], 00:16:54.270 "product_name": "Raid Volume", 00:16:54.270 "block_size": 512, 00:16:54.270 "num_blocks": 253952, 00:16:54.270 "uuid": "59bd94f9-3457-4098-b50c-52787b56c90d", 00:16:54.270 "assigned_rate_limits": { 00:16:54.270 "rw_ios_per_sec": 0, 00:16:54.270 "rw_mbytes_per_sec": 0, 00:16:54.270 "r_mbytes_per_sec": 0, 00:16:54.270 "w_mbytes_per_sec": 0 00:16:54.270 }, 00:16:54.270 "claimed": false, 00:16:54.270 "zoned": false, 00:16:54.270 "supported_io_types": { 00:16:54.270 "read": true, 00:16:54.270 "write": true, 00:16:54.270 "unmap": true, 00:16:54.270 "write_zeroes": true, 00:16:54.270 "flush": true, 00:16:54.270 "reset": true, 00:16:54.270 "compare": false, 00:16:54.270 "compare_and_write": false, 00:16:54.270 "abort": false, 00:16:54.270 "nvme_admin": false, 00:16:54.270 "nvme_io": false 00:16:54.270 }, 00:16:54.270 "memory_domains": [ 00:16:54.270 { 00:16:54.270 "dma_device_id": "system", 00:16:54.270 "dma_device_type": 1 00:16:54.270 }, 00:16:54.270 { 00:16:54.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.270 "dma_device_type": 2 00:16:54.270 }, 00:16:54.270 { 00:16:54.270 "dma_device_id": "system", 00:16:54.270 "dma_device_type": 1 00:16:54.270 }, 00:16:54.270 { 00:16:54.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.270 "dma_device_type": 2 00:16:54.270 }, 00:16:54.270 { 00:16:54.270 "dma_device_id": "system", 00:16:54.270 "dma_device_type": 1 00:16:54.270 }, 00:16:54.270 { 00:16:54.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.270 "dma_device_type": 2 00:16:54.270 }, 00:16:54.270 { 
00:16:54.270 "dma_device_id": "system", 00:16:54.270 "dma_device_type": 1 00:16:54.270 }, 00:16:54.270 { 00:16:54.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.270 "dma_device_type": 2 00:16:54.270 } 00:16:54.270 ], 00:16:54.270 "driver_specific": { 00:16:54.270 "raid": { 00:16:54.270 "uuid": "59bd94f9-3457-4098-b50c-52787b56c90d", 00:16:54.270 "strip_size_kb": 64, 00:16:54.270 "state": "online", 00:16:54.270 "raid_level": "raid0", 00:16:54.270 "superblock": true, 00:16:54.270 "num_base_bdevs": 4, 00:16:54.270 "num_base_bdevs_discovered": 4, 00:16:54.270 "num_base_bdevs_operational": 4, 00:16:54.270 "base_bdevs_list": [ 00:16:54.270 { 00:16:54.270 "name": "pt1", 00:16:54.270 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:54.270 "is_configured": true, 00:16:54.270 "data_offset": 2048, 00:16:54.270 "data_size": 63488 00:16:54.270 }, 00:16:54.270 { 00:16:54.270 "name": "pt2", 00:16:54.270 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:54.270 "is_configured": true, 00:16:54.270 "data_offset": 2048, 00:16:54.270 "data_size": 63488 00:16:54.270 }, 00:16:54.270 { 00:16:54.270 "name": "pt3", 00:16:54.270 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:54.270 "is_configured": true, 00:16:54.270 "data_offset": 2048, 00:16:54.270 "data_size": 63488 00:16:54.270 }, 00:16:54.270 { 00:16:54.270 "name": "pt4", 00:16:54.270 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:54.270 "is_configured": true, 00:16:54.270 "data_offset": 2048, 00:16:54.270 "data_size": 63488 00:16:54.271 } 00:16:54.271 ] 00:16:54.271 } 00:16:54.271 } 00:16:54.271 }' 00:16:54.271 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:54.531 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:54.531 pt2 00:16:54.531 pt3 00:16:54.531 pt4' 00:16:54.531 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.531 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:54.531 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.531 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.531 "name": "pt1", 00:16:54.531 "aliases": [ 00:16:54.531 "00000000-0000-0000-0000-000000000001" 00:16:54.531 ], 00:16:54.531 "product_name": "passthru", 00:16:54.531 "block_size": 512, 00:16:54.531 "num_blocks": 65536, 00:16:54.531 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:54.531 "assigned_rate_limits": { 00:16:54.531 "rw_ios_per_sec": 0, 00:16:54.531 "rw_mbytes_per_sec": 0, 00:16:54.531 "r_mbytes_per_sec": 0, 00:16:54.531 "w_mbytes_per_sec": 0 00:16:54.531 }, 00:16:54.531 "claimed": true, 00:16:54.531 "claim_type": "exclusive_write", 00:16:54.531 "zoned": false, 00:16:54.531 "supported_io_types": { 00:16:54.531 "read": true, 00:16:54.531 "write": true, 00:16:54.531 "unmap": true, 00:16:54.531 "write_zeroes": true, 00:16:54.531 "flush": true, 00:16:54.531 "reset": true, 00:16:54.531 "compare": false, 00:16:54.531 "compare_and_write": false, 00:16:54.531 "abort": true, 00:16:54.531 "nvme_admin": false, 00:16:54.531 "nvme_io": false 00:16:54.531 }, 00:16:54.531 "memory_domains": [ 00:16:54.531 { 00:16:54.531 "dma_device_id": "system", 00:16:54.531 "dma_device_type": 1 00:16:54.531 }, 00:16:54.531 { 
00:16:54.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.531 "dma_device_type": 2 00:16:54.531 } 00:16:54.531 ], 00:16:54.531 "driver_specific": { 00:16:54.531 "passthru": { 00:16:54.531 "name": "pt1", 00:16:54.531 "base_bdev_name": "malloc1" 00:16:54.531 } 00:16:54.531 } 00:16:54.531 }' 00:16:54.531 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.531 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.791 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.791 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.791 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.791 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.791 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.791 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.791 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.791 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.791 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.052 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.052 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:55.052 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:55.052 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:55.052 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:55.052 "name": "pt2", 00:16:55.052 "aliases": [ 00:16:55.052 "00000000-0000-0000-0000-000000000002" 00:16:55.052 ], 00:16:55.052 "product_name": "passthru", 00:16:55.052 "block_size": 512, 00:16:55.052 "num_blocks": 65536, 00:16:55.052 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:55.052 "assigned_rate_limits": { 00:16:55.052 "rw_ios_per_sec": 0, 00:16:55.052 "rw_mbytes_per_sec": 0, 00:16:55.052 "r_mbytes_per_sec": 0, 00:16:55.052 "w_mbytes_per_sec": 0 00:16:55.052 }, 00:16:55.052 "claimed": true, 00:16:55.052 "claim_type": "exclusive_write", 00:16:55.052 "zoned": false, 00:16:55.052 "supported_io_types": { 00:16:55.052 "read": true, 00:16:55.052 "write": true, 00:16:55.052 "unmap": true, 00:16:55.052 "write_zeroes": true, 00:16:55.052 "flush": true, 00:16:55.052 "reset": true, 00:16:55.052 "compare": false, 00:16:55.052 "compare_and_write": false, 00:16:55.052 "abort": true, 00:16:55.052 "nvme_admin": false, 00:16:55.052 "nvme_io": false 00:16:55.052 }, 00:16:55.052 "memory_domains": [ 00:16:55.052 { 00:16:55.052 "dma_device_id": "system", 00:16:55.052 "dma_device_type": 1 00:16:55.052 }, 00:16:55.052 { 00:16:55.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.052 "dma_device_type": 2 00:16:55.052 } 00:16:55.052 ], 00:16:55.052 "driver_specific": { 00:16:55.052 "passthru": { 00:16:55.052 "name": "pt2", 00:16:55.052 "base_bdev_name": "malloc2" 00:16:55.052 } 00:16:55.052 } 00:16:55.052 }' 00:16:55.052 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.312 10:12:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.312 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:55.312 10:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.312 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.312 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.312 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.312 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.312 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.312 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.572 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.572 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.572 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:55.572 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:55.572 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:55.832 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:55.832 "name": "pt3", 00:16:55.832 "aliases": [ 00:16:55.832 "00000000-0000-0000-0000-000000000003" 00:16:55.832 ], 00:16:55.832 "product_name": "passthru", 00:16:55.832 "block_size": 512, 00:16:55.832 "num_blocks": 65536, 00:16:55.832 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:55.832 "assigned_rate_limits": { 00:16:55.832 "rw_ios_per_sec": 0, 00:16:55.832 "rw_mbytes_per_sec": 0, 00:16:55.832 "r_mbytes_per_sec": 0, 00:16:55.832 "w_mbytes_per_sec": 0 00:16:55.832 }, 00:16:55.832 "claimed": true, 00:16:55.832 "claim_type": "exclusive_write", 00:16:55.832 "zoned": false, 00:16:55.832 "supported_io_types": { 00:16:55.832 "read": true, 00:16:55.832 "write": true, 00:16:55.832 "unmap": true, 00:16:55.832 "write_zeroes": true, 00:16:55.832 "flush": true, 00:16:55.832 "reset": true, 00:16:55.832 "compare": false, 00:16:55.832 "compare_and_write": false, 00:16:55.832 "abort": true, 00:16:55.832 "nvme_admin": false, 00:16:55.832 "nvme_io": false 00:16:55.832 }, 00:16:55.832 "memory_domains": [ 00:16:55.832 { 00:16:55.832 "dma_device_id": "system", 00:16:55.832 "dma_device_type": 1 00:16:55.832 }, 00:16:55.833 { 00:16:55.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.833 "dma_device_type": 2 00:16:55.833 } 00:16:55.833 ], 00:16:55.833 "driver_specific": { 00:16:55.833 "passthru": { 00:16:55.833 "name": "pt3", 00:16:55.833 "base_bdev_name": "malloc3" 00:16:55.833 } 00:16:55.833 } 00:16:55.833 }' 00:16:55.833 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.833 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.833 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:55.833 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.833 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.833 10:12:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.833 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.833 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:56.092 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:56.092 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:56.092 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:56.092 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:56.092 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:56.092 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:56.092 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:56.352 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:56.352 "name": "pt4", 00:16:56.352 "aliases": [ 00:16:56.352 "00000000-0000-0000-0000-000000000004" 00:16:56.352 ], 00:16:56.352 "product_name": "passthru", 00:16:56.352 "block_size": 512, 00:16:56.352 "num_blocks": 65536, 00:16:56.352 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:56.352 "assigned_rate_limits": { 00:16:56.352 "rw_ios_per_sec": 0, 00:16:56.352 "rw_mbytes_per_sec": 0, 00:16:56.352 "r_mbytes_per_sec": 0, 00:16:56.352 "w_mbytes_per_sec": 0 00:16:56.352 }, 00:16:56.352 "claimed": true, 00:16:56.352 "claim_type": "exclusive_write", 00:16:56.352 "zoned": false, 00:16:56.352 "supported_io_types": { 00:16:56.352 "read": true, 00:16:56.352 "write": true, 00:16:56.352 "unmap": true, 00:16:56.352 "write_zeroes": true, 00:16:56.352 "flush": true, 00:16:56.352 "reset": true, 00:16:56.352 "compare": false, 00:16:56.352 "compare_and_write": false, 00:16:56.352 "abort": true, 00:16:56.352 "nvme_admin": false, 00:16:56.352 "nvme_io": false 00:16:56.352 }, 00:16:56.352 "memory_domains": [ 00:16:56.352 { 00:16:56.352 "dma_device_id": "system", 00:16:56.352 "dma_device_type": 1 00:16:56.352 }, 00:16:56.352 { 00:16:56.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.352 "dma_device_type": 2 00:16:56.352 } 00:16:56.352 ], 00:16:56.352 "driver_specific": { 00:16:56.352 "passthru": { 00:16:56.352 "name": "pt4", 00:16:56.352 "base_bdev_name": "malloc4" 00:16:56.352 } 00:16:56.352 } 00:16:56.352 }' 00:16:56.352 10:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:56.352 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:56.352 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:56.352 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:56.352 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:56.352 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:56.352 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:56.352 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:56.613 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:56.613 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:16:56.613 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:56.613 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:56.613 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:56.613 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:56.872 [2024-06-10 10:12:18.516733] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:56.872 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=59bd94f9-3457-4098-b50c-52787b56c90d 00:16:56.872 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 59bd94f9-3457-4098-b50c-52787b56c90d ']' 00:16:56.872 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:56.872 [2024-06-10 10:12:18.709007] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:56.872 [2024-06-10 10:12:18.709018] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:56.872 [2024-06-10 10:12:18.709053] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:56.872 [2024-06-10 10:12:18.709100] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:56.872 [2024-06-10 10:12:18.709106] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2566fb0 name raid_bdev1, state offline 00:16:56.872 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.872 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:57.132 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:57.132 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:57.132 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:57.132 10:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:57.391 10:12:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:57.391 10:12:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:57.650 10:12:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:57.650 10:12:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:57.650 10:12:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:57.650 10:12:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:16:57.909 10:12:19 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:57.909 10:12:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:58.170 10:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:58.430 [2024-06-10 10:12:20.044362] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:58.430 [2024-06-10 10:12:20.045430] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:58.430 [2024-06-10 10:12:20.045462] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:58.430 [2024-06-10 10:12:20.045488] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:16:58.430 [2024-06-10 10:12:20.045527] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:58.430 [2024-06-10 10:12:20.045554] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:58.430 [2024-06-10 10:12:20.045568] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:58.430 [2024-06-10 10:12:20.045581] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: 
*ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:16:58.430 [2024-06-10 10:12:20.045591] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:58.430 [2024-06-10 10:12:20.045597] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23b4a10 name raid_bdev1, state configuring 00:16:58.430 request: 00:16:58.430 { 00:16:58.430 "name": "raid_bdev1", 00:16:58.430 "raid_level": "raid0", 00:16:58.431 "base_bdevs": [ 00:16:58.431 "malloc1", 00:16:58.431 "malloc2", 00:16:58.431 "malloc3", 00:16:58.431 "malloc4" 00:16:58.431 ], 00:16:58.431 "superblock": false, 00:16:58.431 "strip_size_kb": 64, 00:16:58.431 "method": "bdev_raid_create", 00:16:58.431 "req_id": 1 00:16:58.431 } 00:16:58.431 Got JSON-RPC error response 00:16:58.431 response: 00:16:58.431 { 00:16:58.431 "code": -17, 00:16:58.431 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:58.431 } 00:16:58.431 10:12:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:16:58.431 10:12:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:16:58.431 10:12:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:16:58.431 10:12:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:16:58.431 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.431 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:58.431 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:58.431 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:58.431 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:58.691 [2024-06-10 10:12:20.433294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:58.691 [2024-06-10 10:12:20.433317] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:58.691 [2024-06-10 10:12:20.433328] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2567230 00:16:58.691 [2024-06-10 10:12:20.433335] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:58.691 [2024-06-10 10:12:20.434595] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:58.691 [2024-06-10 10:12:20.434615] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:58.691 [2024-06-10 10:12:20.434660] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:58.691 [2024-06-10 10:12:20.434677] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:58.691 pt1 00:16:58.691 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:16:58.691 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:58.691 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.691 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:58.691 10:12:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:58.691 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:58.691 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.691 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.691 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.691 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.691 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:58.691 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.950 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.950 "name": "raid_bdev1", 00:16:58.950 "uuid": "59bd94f9-3457-4098-b50c-52787b56c90d", 00:16:58.950 "strip_size_kb": 64, 00:16:58.950 "state": "configuring", 00:16:58.950 "raid_level": "raid0", 00:16:58.950 "superblock": true, 00:16:58.950 "num_base_bdevs": 4, 00:16:58.950 "num_base_bdevs_discovered": 1, 00:16:58.950 "num_base_bdevs_operational": 4, 00:16:58.950 "base_bdevs_list": [ 00:16:58.950 { 00:16:58.950 "name": "pt1", 00:16:58.950 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:58.950 "is_configured": true, 00:16:58.950 "data_offset": 2048, 00:16:58.950 "data_size": 63488 00:16:58.950 }, 00:16:58.950 { 00:16:58.950 "name": null, 00:16:58.950 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:58.950 "is_configured": false, 00:16:58.950 "data_offset": 2048, 00:16:58.950 "data_size": 63488 00:16:58.950 }, 00:16:58.950 { 00:16:58.950 "name": null, 00:16:58.950 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:58.950 "is_configured": false, 00:16:58.950 "data_offset": 2048, 00:16:58.950 "data_size": 63488 00:16:58.950 }, 00:16:58.950 { 00:16:58.951 "name": null, 00:16:58.951 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:58.951 "is_configured": false, 00:16:58.951 "data_offset": 2048, 00:16:58.951 "data_size": 63488 00:16:58.951 } 00:16:58.951 ] 00:16:58.951 }' 00:16:58.951 10:12:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.951 10:12:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.521 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:16:59.521 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:59.521 [2024-06-10 10:12:21.311518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:59.521 [2024-06-10 10:12:21.311544] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:59.521 [2024-06-10 10:12:21.311553] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2561670 00:16:59.521 [2024-06-10 10:12:21.311559] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:59.521 [2024-06-10 10:12:21.311813] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:59.521 [2024-06-10 10:12:21.311830] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:59.521 [2024-06-10 10:12:21.311869] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:59.521 [2024-06-10 10:12:21.311879] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:59.521 pt2 00:16:59.521 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:59.781 [2024-06-10 10:12:21.491978] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.781 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:00.041 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.041 "name": "raid_bdev1", 00:17:00.041 "uuid": "59bd94f9-3457-4098-b50c-52787b56c90d", 00:17:00.041 "strip_size_kb": 64, 00:17:00.041 "state": "configuring", 00:17:00.041 "raid_level": "raid0", 00:17:00.041 "superblock": true, 00:17:00.041 "num_base_bdevs": 4, 00:17:00.041 "num_base_bdevs_discovered": 1, 00:17:00.041 "num_base_bdevs_operational": 4, 00:17:00.041 "base_bdevs_list": [ 00:17:00.041 { 00:17:00.041 "name": "pt1", 00:17:00.041 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:00.041 "is_configured": true, 00:17:00.041 "data_offset": 2048, 00:17:00.041 "data_size": 63488 00:17:00.041 }, 00:17:00.041 { 00:17:00.041 "name": null, 00:17:00.041 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:00.041 "is_configured": false, 00:17:00.041 "data_offset": 2048, 00:17:00.041 "data_size": 63488 00:17:00.041 }, 00:17:00.041 { 00:17:00.041 "name": null, 00:17:00.041 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:00.041 "is_configured": false, 00:17:00.041 "data_offset": 2048, 00:17:00.041 "data_size": 63488 00:17:00.041 }, 00:17:00.041 { 00:17:00.041 "name": null, 00:17:00.041 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:00.041 "is_configured": false, 00:17:00.041 "data_offset": 2048, 00:17:00.041 "data_size": 63488 00:17:00.041 } 00:17:00.041 ] 00:17:00.041 }' 00:17:00.041 10:12:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 
-- # xtrace_disable 00:17:00.041 10:12:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.610 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:00.610 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:00.610 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:00.610 [2024-06-10 10:12:22.434358] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:00.610 [2024-06-10 10:12:22.434389] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:00.610 [2024-06-10 10:12:22.434399] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2566d10 00:17:00.610 [2024-06-10 10:12:22.434406] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:00.610 [2024-06-10 10:12:22.434673] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:00.610 [2024-06-10 10:12:22.434682] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:00.610 [2024-06-10 10:12:22.434723] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:00.610 [2024-06-10 10:12:22.434735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:00.610 pt2 00:17:00.610 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:00.610 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:00.610 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:00.869 [2024-06-10 10:12:22.622843] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:00.869 [2024-06-10 10:12:22.622865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:00.869 [2024-06-10 10:12:22.622873] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2564840 00:17:00.869 [2024-06-10 10:12:22.622879] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:00.869 [2024-06-10 10:12:22.623110] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:00.869 [2024-06-10 10:12:22.623119] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:00.869 [2024-06-10 10:12:22.623153] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:00.869 [2024-06-10 10:12:22.623163] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:00.869 pt3 00:17:00.869 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:00.869 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:00.869 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:01.128 [2024-06-10 10:12:22.811309] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
malloc4 00:17:01.128 [2024-06-10 10:12:22.811327] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:01.128 [2024-06-10 10:12:22.811335] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x255fd20 00:17:01.128 [2024-06-10 10:12:22.811340] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:01.128 [2024-06-10 10:12:22.811554] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:01.128 [2024-06-10 10:12:22.811564] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:01.128 [2024-06-10 10:12:22.811596] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:01.128 [2024-06-10 10:12:22.811606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:01.128 [2024-06-10 10:12:22.811696] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25657d0 00:17:01.128 [2024-06-10 10:12:22.811701] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:01.128 [2024-06-10 10:12:22.811839] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2557320 00:17:01.128 [2024-06-10 10:12:22.811938] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25657d0 00:17:01.128 [2024-06-10 10:12:22.811943] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25657d0 00:17:01.128 [2024-06-10 10:12:22.812014] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:01.128 pt4 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.128 10:12:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:01.388 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.388 "name": "raid_bdev1", 00:17:01.388 "uuid": "59bd94f9-3457-4098-b50c-52787b56c90d", 00:17:01.388 "strip_size_kb": 64, 00:17:01.388 "state": "online", 00:17:01.388 "raid_level": 
"raid0", 00:17:01.388 "superblock": true, 00:17:01.388 "num_base_bdevs": 4, 00:17:01.388 "num_base_bdevs_discovered": 4, 00:17:01.388 "num_base_bdevs_operational": 4, 00:17:01.388 "base_bdevs_list": [ 00:17:01.388 { 00:17:01.388 "name": "pt1", 00:17:01.388 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:01.388 "is_configured": true, 00:17:01.388 "data_offset": 2048, 00:17:01.388 "data_size": 63488 00:17:01.388 }, 00:17:01.388 { 00:17:01.388 "name": "pt2", 00:17:01.388 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:01.388 "is_configured": true, 00:17:01.388 "data_offset": 2048, 00:17:01.388 "data_size": 63488 00:17:01.388 }, 00:17:01.388 { 00:17:01.388 "name": "pt3", 00:17:01.388 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:01.388 "is_configured": true, 00:17:01.388 "data_offset": 2048, 00:17:01.388 "data_size": 63488 00:17:01.388 }, 00:17:01.388 { 00:17:01.388 "name": "pt4", 00:17:01.388 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:01.388 "is_configured": true, 00:17:01.388 "data_offset": 2048, 00:17:01.388 "data_size": 63488 00:17:01.388 } 00:17:01.388 ] 00:17:01.388 }' 00:17:01.388 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.388 10:12:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:01.958 [2024-06-10 10:12:23.745905] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:01.958 "name": "raid_bdev1", 00:17:01.958 "aliases": [ 00:17:01.958 "59bd94f9-3457-4098-b50c-52787b56c90d" 00:17:01.958 ], 00:17:01.958 "product_name": "Raid Volume", 00:17:01.958 "block_size": 512, 00:17:01.958 "num_blocks": 253952, 00:17:01.958 "uuid": "59bd94f9-3457-4098-b50c-52787b56c90d", 00:17:01.958 "assigned_rate_limits": { 00:17:01.958 "rw_ios_per_sec": 0, 00:17:01.958 "rw_mbytes_per_sec": 0, 00:17:01.958 "r_mbytes_per_sec": 0, 00:17:01.958 "w_mbytes_per_sec": 0 00:17:01.958 }, 00:17:01.958 "claimed": false, 00:17:01.958 "zoned": false, 00:17:01.958 "supported_io_types": { 00:17:01.958 "read": true, 00:17:01.958 "write": true, 00:17:01.958 "unmap": true, 00:17:01.958 "write_zeroes": true, 00:17:01.958 "flush": true, 00:17:01.958 "reset": true, 00:17:01.958 "compare": false, 00:17:01.958 "compare_and_write": false, 00:17:01.958 "abort": false, 00:17:01.958 "nvme_admin": false, 00:17:01.958 "nvme_io": false 00:17:01.958 }, 00:17:01.958 "memory_domains": [ 00:17:01.958 { 00:17:01.958 "dma_device_id": "system", 00:17:01.958 "dma_device_type": 1 
00:17:01.958 }, 00:17:01.958 { 00:17:01.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.958 "dma_device_type": 2 00:17:01.958 }, 00:17:01.958 { 00:17:01.958 "dma_device_id": "system", 00:17:01.958 "dma_device_type": 1 00:17:01.958 }, 00:17:01.958 { 00:17:01.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.958 "dma_device_type": 2 00:17:01.958 }, 00:17:01.958 { 00:17:01.958 "dma_device_id": "system", 00:17:01.958 "dma_device_type": 1 00:17:01.958 }, 00:17:01.958 { 00:17:01.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.958 "dma_device_type": 2 00:17:01.958 }, 00:17:01.958 { 00:17:01.958 "dma_device_id": "system", 00:17:01.958 "dma_device_type": 1 00:17:01.958 }, 00:17:01.958 { 00:17:01.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.958 "dma_device_type": 2 00:17:01.958 } 00:17:01.958 ], 00:17:01.958 "driver_specific": { 00:17:01.958 "raid": { 00:17:01.958 "uuid": "59bd94f9-3457-4098-b50c-52787b56c90d", 00:17:01.958 "strip_size_kb": 64, 00:17:01.958 "state": "online", 00:17:01.958 "raid_level": "raid0", 00:17:01.958 "superblock": true, 00:17:01.958 "num_base_bdevs": 4, 00:17:01.958 "num_base_bdevs_discovered": 4, 00:17:01.958 "num_base_bdevs_operational": 4, 00:17:01.958 "base_bdevs_list": [ 00:17:01.958 { 00:17:01.958 "name": "pt1", 00:17:01.958 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:01.958 "is_configured": true, 00:17:01.958 "data_offset": 2048, 00:17:01.958 "data_size": 63488 00:17:01.958 }, 00:17:01.958 { 00:17:01.958 "name": "pt2", 00:17:01.958 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:01.958 "is_configured": true, 00:17:01.958 "data_offset": 2048, 00:17:01.958 "data_size": 63488 00:17:01.958 }, 00:17:01.958 { 00:17:01.958 "name": "pt3", 00:17:01.958 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:01.958 "is_configured": true, 00:17:01.958 "data_offset": 2048, 00:17:01.958 "data_size": 63488 00:17:01.958 }, 00:17:01.958 { 00:17:01.958 "name": "pt4", 00:17:01.958 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:01.958 "is_configured": true, 00:17:01.958 "data_offset": 2048, 00:17:01.958 "data_size": 63488 00:17:01.958 } 00:17:01.958 ] 00:17:01.958 } 00:17:01.958 } 00:17:01.958 }' 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:01.958 pt2 00:17:01.958 pt3 00:17:01.958 pt4' 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:01.958 10:12:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.218 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.218 "name": "pt1", 00:17:02.218 "aliases": [ 00:17:02.218 "00000000-0000-0000-0000-000000000001" 00:17:02.218 ], 00:17:02.218 "product_name": "passthru", 00:17:02.218 "block_size": 512, 00:17:02.218 "num_blocks": 65536, 00:17:02.218 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:02.218 "assigned_rate_limits": { 00:17:02.218 "rw_ios_per_sec": 0, 00:17:02.218 "rw_mbytes_per_sec": 0, 00:17:02.218 "r_mbytes_per_sec": 0, 00:17:02.218 "w_mbytes_per_sec": 0 00:17:02.218 }, 00:17:02.218 "claimed": true, 00:17:02.218 "claim_type": 
"exclusive_write", 00:17:02.218 "zoned": false, 00:17:02.218 "supported_io_types": { 00:17:02.218 "read": true, 00:17:02.218 "write": true, 00:17:02.218 "unmap": true, 00:17:02.218 "write_zeroes": true, 00:17:02.218 "flush": true, 00:17:02.218 "reset": true, 00:17:02.218 "compare": false, 00:17:02.218 "compare_and_write": false, 00:17:02.218 "abort": true, 00:17:02.218 "nvme_admin": false, 00:17:02.218 "nvme_io": false 00:17:02.218 }, 00:17:02.218 "memory_domains": [ 00:17:02.218 { 00:17:02.218 "dma_device_id": "system", 00:17:02.218 "dma_device_type": 1 00:17:02.218 }, 00:17:02.218 { 00:17:02.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.218 "dma_device_type": 2 00:17:02.218 } 00:17:02.218 ], 00:17:02.218 "driver_specific": { 00:17:02.218 "passthru": { 00:17:02.218 "name": "pt1", 00:17:02.218 "base_bdev_name": "malloc1" 00:17:02.218 } 00:17:02.218 } 00:17:02.218 }' 00:17:02.218 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.218 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:02.477 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.737 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.737 "name": "pt2", 00:17:02.737 "aliases": [ 00:17:02.737 "00000000-0000-0000-0000-000000000002" 00:17:02.737 ], 00:17:02.737 "product_name": "passthru", 00:17:02.737 "block_size": 512, 00:17:02.737 "num_blocks": 65536, 00:17:02.737 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:02.737 "assigned_rate_limits": { 00:17:02.737 "rw_ios_per_sec": 0, 00:17:02.737 "rw_mbytes_per_sec": 0, 00:17:02.737 "r_mbytes_per_sec": 0, 00:17:02.737 "w_mbytes_per_sec": 0 00:17:02.737 }, 00:17:02.737 "claimed": true, 00:17:02.737 "claim_type": "exclusive_write", 00:17:02.737 "zoned": false, 00:17:02.737 "supported_io_types": { 00:17:02.737 "read": true, 00:17:02.737 "write": true, 00:17:02.737 "unmap": true, 00:17:02.737 "write_zeroes": true, 00:17:02.737 "flush": true, 00:17:02.737 "reset": true, 00:17:02.737 "compare": false, 00:17:02.737 "compare_and_write": false, 00:17:02.737 "abort": true, 00:17:02.737 "nvme_admin": false, 00:17:02.737 "nvme_io": false 00:17:02.737 
}, 00:17:02.737 "memory_domains": [ 00:17:02.737 { 00:17:02.737 "dma_device_id": "system", 00:17:02.737 "dma_device_type": 1 00:17:02.737 }, 00:17:02.737 { 00:17:02.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.737 "dma_device_type": 2 00:17:02.737 } 00:17:02.737 ], 00:17:02.737 "driver_specific": { 00:17:02.737 "passthru": { 00:17:02.737 "name": "pt2", 00:17:02.737 "base_bdev_name": "malloc2" 00:17:02.737 } 00:17:02.737 } 00:17:02.737 }' 00:17:02.737 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.737 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.997 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.997 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.997 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.997 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.997 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.997 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.997 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.997 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.997 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.257 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.257 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.257 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:03.257 10:12:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.257 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.257 "name": "pt3", 00:17:03.257 "aliases": [ 00:17:03.257 "00000000-0000-0000-0000-000000000003" 00:17:03.257 ], 00:17:03.257 "product_name": "passthru", 00:17:03.257 "block_size": 512, 00:17:03.257 "num_blocks": 65536, 00:17:03.257 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:03.257 "assigned_rate_limits": { 00:17:03.257 "rw_ios_per_sec": 0, 00:17:03.257 "rw_mbytes_per_sec": 0, 00:17:03.257 "r_mbytes_per_sec": 0, 00:17:03.257 "w_mbytes_per_sec": 0 00:17:03.257 }, 00:17:03.257 "claimed": true, 00:17:03.257 "claim_type": "exclusive_write", 00:17:03.257 "zoned": false, 00:17:03.257 "supported_io_types": { 00:17:03.257 "read": true, 00:17:03.257 "write": true, 00:17:03.257 "unmap": true, 00:17:03.257 "write_zeroes": true, 00:17:03.257 "flush": true, 00:17:03.257 "reset": true, 00:17:03.257 "compare": false, 00:17:03.257 "compare_and_write": false, 00:17:03.257 "abort": true, 00:17:03.257 "nvme_admin": false, 00:17:03.257 "nvme_io": false 00:17:03.257 }, 00:17:03.257 "memory_domains": [ 00:17:03.257 { 00:17:03.257 "dma_device_id": "system", 00:17:03.257 "dma_device_type": 1 00:17:03.257 }, 00:17:03.257 { 00:17:03.257 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.257 "dma_device_type": 2 00:17:03.257 } 00:17:03.257 ], 00:17:03.257 "driver_specific": { 00:17:03.257 "passthru": { 00:17:03.257 "name": "pt3", 00:17:03.257 "base_bdev_name": "malloc3" 00:17:03.257 } 00:17:03.257 } 
00:17:03.257 }' 00:17:03.257 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.518 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.518 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:03.518 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.518 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.518 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:03.518 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.518 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.518 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:03.518 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.518 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.780 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.780 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.780 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:03.780 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.780 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.780 "name": "pt4", 00:17:03.780 "aliases": [ 00:17:03.780 "00000000-0000-0000-0000-000000000004" 00:17:03.780 ], 00:17:03.780 "product_name": "passthru", 00:17:03.780 "block_size": 512, 00:17:03.780 "num_blocks": 65536, 00:17:03.780 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:03.780 "assigned_rate_limits": { 00:17:03.780 "rw_ios_per_sec": 0, 00:17:03.780 "rw_mbytes_per_sec": 0, 00:17:03.780 "r_mbytes_per_sec": 0, 00:17:03.780 "w_mbytes_per_sec": 0 00:17:03.780 }, 00:17:03.780 "claimed": true, 00:17:03.780 "claim_type": "exclusive_write", 00:17:03.780 "zoned": false, 00:17:03.780 "supported_io_types": { 00:17:03.780 "read": true, 00:17:03.780 "write": true, 00:17:03.780 "unmap": true, 00:17:03.780 "write_zeroes": true, 00:17:03.780 "flush": true, 00:17:03.780 "reset": true, 00:17:03.780 "compare": false, 00:17:03.780 "compare_and_write": false, 00:17:03.780 "abort": true, 00:17:03.780 "nvme_admin": false, 00:17:03.780 "nvme_io": false 00:17:03.780 }, 00:17:03.780 "memory_domains": [ 00:17:03.780 { 00:17:03.780 "dma_device_id": "system", 00:17:03.780 "dma_device_type": 1 00:17:03.780 }, 00:17:03.780 { 00:17:03.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.780 "dma_device_type": 2 00:17:03.780 } 00:17:03.780 ], 00:17:03.780 "driver_specific": { 00:17:03.780 "passthru": { 00:17:03.780 "name": "pt4", 00:17:03.780 "base_bdev_name": "malloc4" 00:17:03.780 } 00:17:03.780 } 00:17:03.780 }' 00:17:03.780 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.780 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.040 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:04.040 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.040 10:12:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.040 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:04.040 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.041 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.041 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:04.041 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.301 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.301 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:04.301 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:04.301 10:12:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:04.301 [2024-06-10 10:12:26.131939] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:04.301 10:12:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 59bd94f9-3457-4098-b50c-52787b56c90d '!=' 59bd94f9-3457-4098-b50c-52787b56c90d ']' 00:17:04.301 10:12:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:17:04.301 10:12:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:04.301 10:12:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:04.301 10:12:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1032112 00:17:04.301 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1032112 ']' 00:17:04.301 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1032112 00:17:04.301 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:17:04.301 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:04.301 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1032112 00:17:04.561 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:04.561 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:04.561 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1032112' 00:17:04.561 killing process with pid 1032112 00:17:04.561 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1032112 00:17:04.561 [2024-06-10 10:12:26.201149] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:04.561 [2024-06-10 10:12:26.201193] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:04.561 [2024-06-10 10:12:26.201238] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:04.561 [2024-06-10 10:12:26.201243] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25657d0 name raid_bdev1, state offline 00:17:04.561 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1032112 00:17:04.561 [2024-06-10 10:12:26.222001] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 
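A condensed sketch of the state check that verify_raid_bdev_state and verify_raid_bdev_properties perform in the trace above, assuming an SPDK app serving RPCs on /var/tmp/spdk-raid.sock and a configured raid0 bdev named raid_bdev1 backed by pt1..pt4. The rpc.py path, RPC names and jq filters are the ones traced above; the loop structure and exit handling here are illustrative only.

    #!/usr/bin/env bash
    # Minimal re-statement of the checks traced above (assumptions: the RPC
    # socket, bdev names and expected values are taken from this job's trace).
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Top-level raid state, as read via bdev_raid_get_bdevs all.
    info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [ "$(echo "$info" | jq -r .state)" = online ] || exit 1
    [ "$(echo "$info" | jq -r .raid_level)" = raid0 ] || exit 1
    [ "$(echo "$info" | jq .num_base_bdevs_discovered)" -eq 4 ] || exit 1

    # Per-base-bdev properties, as read via bdev_get_bdevs -b <pt>.
    for name in pt1 pt2 pt3 pt4; do
        base=$($RPC bdev_get_bdevs -b "$name" | jq '.[]')
        [ "$(echo "$base" | jq .block_size)" -eq 512 ] || exit 1
        [ "$(echo "$base" | jq .md_size)" = null ] || exit 1
        [ "$(echo "$base" | jq .dif_type)" = null ] || exit 1
    done

In the trace itself each RPC response is captured once (raid_bdev_info, base_bdev_info) and the per-field jq filters are then applied to the captured JSON rather than re-issuing the RPC for every field.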
00:17:04.561 10:12:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:04.561 00:17:04.561 real 0m13.812s 00:17:04.561 user 0m25.429s 00:17:04.561 sys 0m2.015s 00:17:04.561 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:04.561 10:12:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.561 ************************************ 00:17:04.561 END TEST raid_superblock_test 00:17:04.561 ************************************ 00:17:04.561 10:12:26 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:17:04.561 10:12:26 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:17:04.561 10:12:26 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:04.561 10:12:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:04.561 ************************************ 00:17:04.561 START TEST raid_read_error_test 00:17:04.561 ************************************ 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 4 read 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:04.561 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:04.562 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local 
bdevperf_log 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.s969k25OlP 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1034734 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1034734 /var/tmp/spdk-raid.sock 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1034734 ']' 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:04.822 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:04.822 10:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.822 [2024-06-10 10:12:26.488723] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
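The error tests drive I/O through bdevperf rather than the plain test app; the binary path, socket and flags are the ones shown in the trace above. A minimal sketch of that launch step, assuming the waitforlisten helper from SPDK's autotest_common.sh that the trace calls; the log redirection and variable names are illustrative.

    BDEVPERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
    bdevperf_log=$(mktemp -p /raidtest)

    # 60s randrw job against raid_bdev1. -z defers the workload until
    # bdevperf.py perform_tests is called later in the trace, and
    # -L bdev_raid turns on the bdev_raid DEBUG lines seen throughout.
    $BDEVPERF -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
              -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" 2>&1 &
    raid_pid=$!
    waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock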
00:17:04.822 [2024-06-10 10:12:26.488774] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1034734 ] 00:17:04.822 [2024-06-10 10:12:26.578069] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:04.822 [2024-06-10 10:12:26.643269] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:05.081 [2024-06-10 10:12:26.696582] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:05.081 [2024-06-10 10:12:26.696610] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:05.652 10:12:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:05.652 10:12:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:17:05.652 10:12:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:05.652 10:12:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:05.652 BaseBdev1_malloc 00:17:05.652 10:12:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:05.912 true 00:17:05.912 10:12:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:06.173 [2024-06-10 10:12:27.864157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:06.173 [2024-06-10 10:12:27.864188] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.173 [2024-06-10 10:12:27.864199] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1035d10 00:17:06.173 [2024-06-10 10:12:27.864206] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.173 [2024-06-10 10:12:27.865554] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.173 [2024-06-10 10:12:27.865573] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:06.173 BaseBdev1 00:17:06.173 10:12:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:06.173 10:12:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:06.433 BaseBdev2_malloc 00:17:06.433 10:12:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:06.433 true 00:17:06.433 10:12:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:06.694 [2024-06-10 10:12:28.419481] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:06.694 [2024-06-10 10:12:28.419509] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:17:06.694 [2024-06-10 10:12:28.419520] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103a710 00:17:06.694 [2024-06-10 10:12:28.419526] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.694 [2024-06-10 10:12:28.420709] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.694 [2024-06-10 10:12:28.420728] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:06.694 BaseBdev2 00:17:06.694 10:12:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:06.694 10:12:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:06.955 BaseBdev3_malloc 00:17:06.955 10:12:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:06.955 true 00:17:06.955 10:12:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:07.216 [2024-06-10 10:12:28.974771] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:07.216 [2024-06-10 10:12:28.974797] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:07.216 [2024-06-10 10:12:28.974806] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103b340 00:17:07.216 [2024-06-10 10:12:28.974813] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:07.216 [2024-06-10 10:12:28.975988] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:07.216 [2024-06-10 10:12:28.976006] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:07.216 BaseBdev3 00:17:07.216 10:12:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:07.216 10:12:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:07.477 BaseBdev4_malloc 00:17:07.477 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:07.738 true 00:17:07.738 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:07.738 [2024-06-10 10:12:29.534019] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:07.738 [2024-06-10 10:12:29.534046] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:07.738 [2024-06-10 10:12:29.534058] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1034aa0 00:17:07.738 [2024-06-10 10:12:29.534064] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:07.738 [2024-06-10 10:12:29.535254] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:07.738 [2024-06-10 
10:12:29.535273] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:07.738 BaseBdev4 00:17:07.738 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:08.029 [2024-06-10 10:12:29.710490] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:08.029 [2024-06-10 10:12:29.711488] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:08.029 [2024-06-10 10:12:29.711540] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:08.029 [2024-06-10 10:12:29.711593] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:08.029 [2024-06-10 10:12:29.711769] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x103e1b0 00:17:08.029 [2024-06-10 10:12:29.711776] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:08.029 [2024-06-10 10:12:29.711923] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe850d0 00:17:08.029 [2024-06-10 10:12:29.712038] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x103e1b0 00:17:08.029 [2024-06-10 10:12:29.712044] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x103e1b0 00:17:08.029 [2024-06-10 10:12:29.712118] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.029 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:08.293 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.293 "name": "raid_bdev1", 00:17:08.293 "uuid": "e0f8909c-5012-408a-8fd0-b800d7a3ec08", 00:17:08.293 "strip_size_kb": 64, 00:17:08.293 "state": "online", 00:17:08.293 "raid_level": "raid0", 00:17:08.293 "superblock": true, 00:17:08.293 "num_base_bdevs": 4, 00:17:08.293 "num_base_bdevs_discovered": 4, 00:17:08.293 "num_base_bdevs_operational": 4, 00:17:08.293 
"base_bdevs_list": [ 00:17:08.293 { 00:17:08.293 "name": "BaseBdev1", 00:17:08.293 "uuid": "ddaf7822-0ef8-5317-9a22-70f5e648bbfc", 00:17:08.293 "is_configured": true, 00:17:08.293 "data_offset": 2048, 00:17:08.293 "data_size": 63488 00:17:08.293 }, 00:17:08.293 { 00:17:08.293 "name": "BaseBdev2", 00:17:08.293 "uuid": "5067c090-d007-52e6-bf16-fd1356f42dfe", 00:17:08.293 "is_configured": true, 00:17:08.293 "data_offset": 2048, 00:17:08.293 "data_size": 63488 00:17:08.293 }, 00:17:08.293 { 00:17:08.293 "name": "BaseBdev3", 00:17:08.293 "uuid": "16147f7c-6732-5216-834a-489ad327dc4d", 00:17:08.293 "is_configured": true, 00:17:08.293 "data_offset": 2048, 00:17:08.293 "data_size": 63488 00:17:08.293 }, 00:17:08.293 { 00:17:08.293 "name": "BaseBdev4", 00:17:08.293 "uuid": "6c85d87b-f8bc-5541-8a70-f48e45205f16", 00:17:08.293 "is_configured": true, 00:17:08.293 "data_offset": 2048, 00:17:08.293 "data_size": 63488 00:17:08.293 } 00:17:08.293 ] 00:17:08.293 }' 00:17:08.293 10:12:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.293 10:12:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:08.864 10:12:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:08.864 10:12:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:08.864 [2024-06-10 10:12:30.540764] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1037060 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.804 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:17:10.065 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:10.065 "name": "raid_bdev1", 00:17:10.065 "uuid": "e0f8909c-5012-408a-8fd0-b800d7a3ec08", 00:17:10.065 "strip_size_kb": 64, 00:17:10.065 "state": "online", 00:17:10.065 "raid_level": "raid0", 00:17:10.065 "superblock": true, 00:17:10.065 "num_base_bdevs": 4, 00:17:10.065 "num_base_bdevs_discovered": 4, 00:17:10.065 "num_base_bdevs_operational": 4, 00:17:10.065 "base_bdevs_list": [ 00:17:10.065 { 00:17:10.065 "name": "BaseBdev1", 00:17:10.065 "uuid": "ddaf7822-0ef8-5317-9a22-70f5e648bbfc", 00:17:10.065 "is_configured": true, 00:17:10.065 "data_offset": 2048, 00:17:10.065 "data_size": 63488 00:17:10.065 }, 00:17:10.065 { 00:17:10.065 "name": "BaseBdev2", 00:17:10.065 "uuid": "5067c090-d007-52e6-bf16-fd1356f42dfe", 00:17:10.065 "is_configured": true, 00:17:10.065 "data_offset": 2048, 00:17:10.065 "data_size": 63488 00:17:10.065 }, 00:17:10.065 { 00:17:10.065 "name": "BaseBdev3", 00:17:10.065 "uuid": "16147f7c-6732-5216-834a-489ad327dc4d", 00:17:10.065 "is_configured": true, 00:17:10.065 "data_offset": 2048, 00:17:10.065 "data_size": 63488 00:17:10.065 }, 00:17:10.065 { 00:17:10.065 "name": "BaseBdev4", 00:17:10.065 "uuid": "6c85d87b-f8bc-5541-8a70-f48e45205f16", 00:17:10.065 "is_configured": true, 00:17:10.065 "data_offset": 2048, 00:17:10.065 "data_size": 63488 00:17:10.065 } 00:17:10.065 ] 00:17:10.065 }' 00:17:10.065 10:12:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:10.065 10:12:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.635 10:12:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:10.894 [2024-06-10 10:12:32.575703] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:10.894 [2024-06-10 10:12:32.575732] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:10.894 [2024-06-10 10:12:32.578320] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:10.894 [2024-06-10 10:12:32.578346] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:10.895 [2024-06-10 10:12:32.578375] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:10.895 [2024-06-10 10:12:32.578381] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x103e1b0 name raid_bdev1, state offline 00:17:10.895 0 00:17:10.895 10:12:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1034734 00:17:10.895 10:12:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1034734 ']' 00:17:10.895 10:12:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1034734 00:17:10.895 10:12:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:17:10.895 10:12:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:10.895 10:12:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1034734 00:17:10.895 10:12:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:10.895 10:12:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:10.895 10:12:32 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1034734' 00:17:10.895 killing process with pid 1034734 00:17:10.895 10:12:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1034734 00:17:10.895 [2024-06-10 10:12:32.644620] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:10.895 10:12:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1034734 00:17:10.895 [2024-06-10 10:12:32.661876] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:11.155 10:12:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.s969k25OlP 00:17:11.155 10:12:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:11.155 10:12:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:11.155 10:12:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:17:11.155 10:12:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:17:11.155 10:12:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:11.155 10:12:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:11.155 10:12:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:17:11.155 00:17:11.155 real 0m6.376s 00:17:11.155 user 0m10.246s 00:17:11.155 sys 0m0.885s 00:17:11.155 10:12:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:11.155 10:12:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:11.155 ************************************ 00:17:11.155 END TEST raid_read_error_test 00:17:11.155 ************************************ 00:17:11.155 10:12:32 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:17:11.155 10:12:32 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:17:11.155 10:12:32 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:11.155 10:12:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:11.155 ************************************ 00:17:11.155 START TEST raid_write_error_test 00:17:11.155 ************************************ 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 4 write 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ObCtlJT2s0 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1035933 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1035933 /var/tmp/spdk-raid.sock 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1035933 ']' 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:11.155 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:11.155 10:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:11.155 [2024-06-10 10:12:32.942903] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
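As the trace below shows, each of the four base bdevs for the error tests is a three-layer stack: a 32 MiB malloc bdev, an error bdev wrapped around it, and a passthru bdev on top that the raid volume actually claims. A condensed sketch of that construction and of the raid assembly, using the same rpc.py path and RPC socket as the trace; the loop is illustrative.

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # malloc -> error -> passthru, mirroring the bdev_malloc_create /
    # bdev_error_create / bdev_passthru_create calls in the trace below.
    for i in 1 2 3 4; do
        $RPC bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
        $RPC bdev_error_create "BaseBdev${i}_malloc"          # exposes EE_BaseBdev${i}_malloc
        $RPC bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"
    done

    # Striped raid0 with a superblock (-s) over the four passthru bdevs.
    $RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s

The error bdev in the middle is what bdev_error_inject_error targets later in the trace, while the passthru layer gives the raid volume a stable BaseBdevN name to claim.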
00:17:11.155 [2024-06-10 10:12:32.942949] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1035933 ] 00:17:11.415 [2024-06-10 10:12:33.030148] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:11.415 [2024-06-10 10:12:33.093904] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:11.415 [2024-06-10 10:12:33.133928] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:11.415 [2024-06-10 10:12:33.133952] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:11.985 10:12:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:11.985 10:12:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:17:11.985 10:12:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:11.985 10:12:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:12.246 BaseBdev1_malloc 00:17:12.246 10:12:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:12.505 true 00:17:12.506 10:12:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:12.506 [2024-06-10 10:12:34.320330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:12.506 [2024-06-10 10:12:34.320359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:12.506 [2024-06-10 10:12:34.320374] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14ded10 00:17:12.506 [2024-06-10 10:12:34.320380] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:12.506 [2024-06-10 10:12:34.321720] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:12.506 [2024-06-10 10:12:34.321740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:12.506 BaseBdev1 00:17:12.506 10:12:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:12.506 10:12:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:12.766 BaseBdev2_malloc 00:17:12.766 10:12:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:13.026 true 00:17:13.026 10:12:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:13.026 [2024-06-10 10:12:34.891699] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:13.026 [2024-06-10 10:12:34.891728] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:17:13.026 [2024-06-10 10:12:34.891739] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14e3710 00:17:13.026 [2024-06-10 10:12:34.891745] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:13.288 [2024-06-10 10:12:34.892933] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:13.288 [2024-06-10 10:12:34.892952] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:13.288 BaseBdev2 00:17:13.288 10:12:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:13.288 10:12:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:13.288 BaseBdev3_malloc 00:17:13.288 10:12:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:13.548 true 00:17:13.548 10:12:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:13.809 [2024-06-10 10:12:35.447011] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:13.809 [2024-06-10 10:12:35.447038] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:13.809 [2024-06-10 10:12:35.447049] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14e4340 00:17:13.809 [2024-06-10 10:12:35.447055] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:13.809 [2024-06-10 10:12:35.448238] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:13.809 [2024-06-10 10:12:35.448256] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:13.809 BaseBdev3 00:17:13.809 10:12:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:13.809 10:12:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:13.809 BaseBdev4_malloc 00:17:13.809 10:12:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:14.070 true 00:17:14.070 10:12:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:14.330 [2024-06-10 10:12:36.014358] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:14.330 [2024-06-10 10:12:36.014391] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:14.330 [2024-06-10 10:12:36.014404] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14ddaa0 00:17:14.330 [2024-06-10 10:12:36.014410] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:14.331 [2024-06-10 10:12:36.015599] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
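Once the stack is assembled, the write-error case differs from the read case above only in the I/O type passed to bdev_error_inject_error and in which bdevperf log gets parsed. A minimal sketch of that inject-and-verify step, assuming the bdevperf instance started above and its log file /raidtest/tmp.ObCtlJT2s0 from the trace; the final wait is illustrative, the real script tears bdevperf down with killprocess.

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    BDEVPERF_PY=/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py
    bdevperf_log=/raidtest/tmp.ObCtlJT2s0

    # Kick off the queued randrw job, then arm a write failure on BaseBdev1's
    # error bdev, mirroring bdev_raid.sh@823/@824/@827 in the trace below.
    $BDEVPERF_PY -s /var/tmp/spdk-raid.sock perform_tests &
    sleep 1
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
    wait

    # raid0 has no redundancy, so the test expects a non-zero failure rate;
    # the extraction matches the grep/awk pipeline used for the read test above.
    fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
    [[ $fail_per_s != 0.00 ]]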
00:17:14.331 [2024-06-10 10:12:36.015619] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:14.331 BaseBdev4 00:17:14.331 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:14.331 [2024-06-10 10:12:36.190829] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:14.331 [2024-06-10 10:12:36.191814] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:14.331 [2024-06-10 10:12:36.191873] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:14.331 [2024-06-10 10:12:36.191921] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:14.331 [2024-06-10 10:12:36.192099] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14e71b0 00:17:14.331 [2024-06-10 10:12:36.192106] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:14.331 [2024-06-10 10:12:36.192246] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x132e0d0 00:17:14.331 [2024-06-10 10:12:36.192361] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14e71b0 00:17:14.331 [2024-06-10 10:12:36.192366] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14e71b0 00:17:14.331 [2024-06-10 10:12:36.192440] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.592 "name": "raid_bdev1", 00:17:14.592 "uuid": "6de17b98-68d6-4546-915a-3ff58440f209", 00:17:14.592 "strip_size_kb": 64, 00:17:14.592 "state": "online", 00:17:14.592 "raid_level": "raid0", 00:17:14.592 "superblock": true, 00:17:14.592 "num_base_bdevs": 4, 00:17:14.592 "num_base_bdevs_discovered": 4, 00:17:14.592 
"num_base_bdevs_operational": 4, 00:17:14.592 "base_bdevs_list": [ 00:17:14.592 { 00:17:14.592 "name": "BaseBdev1", 00:17:14.592 "uuid": "3524f425-649b-5ed8-99d3-763cef1c7826", 00:17:14.592 "is_configured": true, 00:17:14.592 "data_offset": 2048, 00:17:14.592 "data_size": 63488 00:17:14.592 }, 00:17:14.592 { 00:17:14.592 "name": "BaseBdev2", 00:17:14.592 "uuid": "26788088-1e9e-5b75-a324-e7944c545461", 00:17:14.592 "is_configured": true, 00:17:14.592 "data_offset": 2048, 00:17:14.592 "data_size": 63488 00:17:14.592 }, 00:17:14.592 { 00:17:14.592 "name": "BaseBdev3", 00:17:14.592 "uuid": "f673329b-89a1-56fb-8856-ca6e3a4614cb", 00:17:14.592 "is_configured": true, 00:17:14.592 "data_offset": 2048, 00:17:14.592 "data_size": 63488 00:17:14.592 }, 00:17:14.592 { 00:17:14.592 "name": "BaseBdev4", 00:17:14.592 "uuid": "85eab451-480a-5849-9e55-4232dac4df6a", 00:17:14.592 "is_configured": true, 00:17:14.592 "data_offset": 2048, 00:17:14.592 "data_size": 63488 00:17:14.592 } 00:17:14.592 ] 00:17:14.592 }' 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.592 10:12:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.163 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:15.163 10:12:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:15.163 [2024-06-10 10:12:37.001051] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14e0060 00:17:16.102 10:12:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:16.361 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.361 10:12:38 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:16.621 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:16.621 "name": "raid_bdev1", 00:17:16.621 "uuid": "6de17b98-68d6-4546-915a-3ff58440f209", 00:17:16.621 "strip_size_kb": 64, 00:17:16.621 "state": "online", 00:17:16.621 "raid_level": "raid0", 00:17:16.621 "superblock": true, 00:17:16.621 "num_base_bdevs": 4, 00:17:16.621 "num_base_bdevs_discovered": 4, 00:17:16.621 "num_base_bdevs_operational": 4, 00:17:16.621 "base_bdevs_list": [ 00:17:16.621 { 00:17:16.621 "name": "BaseBdev1", 00:17:16.621 "uuid": "3524f425-649b-5ed8-99d3-763cef1c7826", 00:17:16.621 "is_configured": true, 00:17:16.621 "data_offset": 2048, 00:17:16.621 "data_size": 63488 00:17:16.621 }, 00:17:16.621 { 00:17:16.621 "name": "BaseBdev2", 00:17:16.621 "uuid": "26788088-1e9e-5b75-a324-e7944c545461", 00:17:16.621 "is_configured": true, 00:17:16.621 "data_offset": 2048, 00:17:16.621 "data_size": 63488 00:17:16.621 }, 00:17:16.621 { 00:17:16.621 "name": "BaseBdev3", 00:17:16.621 "uuid": "f673329b-89a1-56fb-8856-ca6e3a4614cb", 00:17:16.621 "is_configured": true, 00:17:16.621 "data_offset": 2048, 00:17:16.621 "data_size": 63488 00:17:16.621 }, 00:17:16.621 { 00:17:16.621 "name": "BaseBdev4", 00:17:16.621 "uuid": "85eab451-480a-5849-9e55-4232dac4df6a", 00:17:16.621 "is_configured": true, 00:17:16.621 "data_offset": 2048, 00:17:16.621 "data_size": 63488 00:17:16.621 } 00:17:16.621 ] 00:17:16.621 }' 00:17:16.621 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:16.621 10:12:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.192 10:12:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:17.192 [2024-06-10 10:12:39.042963] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:17.192 [2024-06-10 10:12:39.042997] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:17.192 [2024-06-10 10:12:39.045578] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:17.192 [2024-06-10 10:12:39.045605] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:17.192 [2024-06-10 10:12:39.045633] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:17.192 [2024-06-10 10:12:39.045639] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14e71b0 name raid_bdev1, state offline 00:17:17.192 0 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1035933 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1035933 ']' 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1035933 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1035933 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:17.452 10:12:39 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1035933' 00:17:17.452 killing process with pid 1035933 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1035933 00:17:17.452 [2024-06-10 10:12:39.110788] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1035933 00:17:17.452 [2024-06-10 10:12:39.127702] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ObCtlJT2s0 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:17:17.452 00:17:17.452 real 0m6.386s 00:17:17.452 user 0m10.280s 00:17:17.452 sys 0m0.882s 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:17.452 10:12:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.452 ************************************ 00:17:17.452 END TEST raid_write_error_test 00:17:17.452 ************************************ 00:17:17.452 10:12:39 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:17.452 10:12:39 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:17:17.452 10:12:39 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:17:17.452 10:12:39 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:17.452 10:12:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:17.713 ************************************ 00:17:17.713 START TEST raid_state_function_test 00:17:17.713 ************************************ 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 4 false 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 
-- # (( i++ )) 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1037087 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1037087' 00:17:17.713 Process raid pid: 1037087 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1037087 /var/tmp/spdk-raid.sock 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1037087 ']' 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:17.713 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
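For context before the state-function run continues, the write-error check that raid_write_error_test completed above reduces to the sequence sketched here. It is a hedged reconstruction from the commands logged earlier in the trace; bdevperf_output stands in for the temporary results file the test created (/raidtest/tmp.* in this run), and it assumes the bdevperf application is already running against the same RPC socket, as it was in the test.

# Sketch of the failure-injection check, reconstructed from the trace above.
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# arm write failures on the error bdev underneath BaseBdev1
$rpc_py bdev_error_inject_error EE_BaseBdev1_malloc write failure

# drive the queued I/O through the raid bdev via the already-running bdevperf instance
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
    -s /var/tmp/spdk-raid.sock perform_tests

# the test pulls column 6 of the raid_bdev1 result line as the fail/s rate (0.49 in the
# run above); raid0 has no redundancy, so injected write errors must surface as failed I/O
fail_per_s=$(grep -v Job "$bdevperf_output" | grep raid_bdev1 | awk '{print $6}')
[[ $fail_per_s != "0.00" ]]

# tear the array down once the check passes
$rpc_py bdev_raid_delete raid_bdev1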
00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.713 10:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:17.713 [2024-06-10 10:12:39.398940] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:17:17.713 [2024-06-10 10:12:39.399001] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:17.713 [2024-06-10 10:12:39.492004] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:17.713 [2024-06-10 10:12:39.555981] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:17.974 [2024-06-10 10:12:39.598865] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:17.974 [2024-06-10 10:12:39.598886] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:18.544 [2024-06-10 10:12:40.389931] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:18.544 [2024-06-10 10:12:40.389964] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:18.544 [2024-06-10 10:12:40.389969] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:18.544 [2024-06-10 10:12:40.389975] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:18.544 [2024-06-10 10:12:40.389982] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:18.544 [2024-06-10 10:12:40.389987] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:18.544 [2024-06-10 10:12:40.389992] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:18.544 [2024-06-10 10:12:40.389997] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.544 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.545 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.545 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:18.805 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:18.805 "name": "Existed_Raid", 00:17:18.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.805 "strip_size_kb": 64, 00:17:18.805 "state": "configuring", 00:17:18.805 "raid_level": "concat", 00:17:18.805 "superblock": false, 00:17:18.805 "num_base_bdevs": 4, 00:17:18.805 "num_base_bdevs_discovered": 0, 00:17:18.805 "num_base_bdevs_operational": 4, 00:17:18.805 "base_bdevs_list": [ 00:17:18.805 { 00:17:18.805 "name": "BaseBdev1", 00:17:18.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.805 "is_configured": false, 00:17:18.805 "data_offset": 0, 00:17:18.805 "data_size": 0 00:17:18.805 }, 00:17:18.805 { 00:17:18.805 "name": "BaseBdev2", 00:17:18.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.805 "is_configured": false, 00:17:18.805 "data_offset": 0, 00:17:18.805 "data_size": 0 00:17:18.805 }, 00:17:18.805 { 00:17:18.805 "name": "BaseBdev3", 00:17:18.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.805 "is_configured": false, 00:17:18.805 "data_offset": 0, 00:17:18.805 "data_size": 0 00:17:18.805 }, 00:17:18.805 { 00:17:18.805 "name": "BaseBdev4", 00:17:18.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.805 "is_configured": false, 00:17:18.805 "data_offset": 0, 00:17:18.805 "data_size": 0 00:17:18.805 } 00:17:18.805 ] 00:17:18.805 }' 00:17:18.805 10:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:18.805 10:12:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:19.375 10:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:19.635 [2024-06-10 10:12:41.300109] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:19.635 [2024-06-10 10:12:41.300125] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c6b20 name Existed_Raid, state configuring 00:17:19.635 10:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:19.635 [2024-06-10 10:12:41.488605] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:19.635 [2024-06-10 10:12:41.488625] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:19.635 [2024-06-10 10:12:41.488630] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:19.635 [2024-06-10 10:12:41.488635] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:19.635 [2024-06-10 10:12:41.488640] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:19.635 [2024-06-10 10:12:41.488645] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:19.635 [2024-06-10 10:12:41.488649] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:19.635 [2024-06-10 10:12:41.488655] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:19.895 10:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:19.895 [2024-06-10 10:12:41.683712] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:19.895 BaseBdev1 00:17:19.895 10:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:19.895 10:12:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:17:19.895 10:12:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:19.895 10:12:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:19.895 10:12:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:19.895 10:12:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:19.895 10:12:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:20.155 10:12:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:20.423 [ 00:17:20.423 { 00:17:20.423 "name": "BaseBdev1", 00:17:20.423 "aliases": [ 00:17:20.423 "dc00fd4a-3661-417f-8447-22ad59ee56a2" 00:17:20.423 ], 00:17:20.423 "product_name": "Malloc disk", 00:17:20.423 "block_size": 512, 00:17:20.423 "num_blocks": 65536, 00:17:20.423 "uuid": "dc00fd4a-3661-417f-8447-22ad59ee56a2", 00:17:20.423 "assigned_rate_limits": { 00:17:20.423 "rw_ios_per_sec": 0, 00:17:20.423 "rw_mbytes_per_sec": 0, 00:17:20.423 "r_mbytes_per_sec": 0, 00:17:20.423 "w_mbytes_per_sec": 0 00:17:20.423 }, 00:17:20.423 "claimed": true, 00:17:20.423 "claim_type": "exclusive_write", 00:17:20.423 "zoned": false, 00:17:20.423 "supported_io_types": { 00:17:20.423 "read": true, 00:17:20.423 "write": true, 00:17:20.423 "unmap": true, 00:17:20.423 "write_zeroes": true, 00:17:20.423 "flush": true, 00:17:20.423 "reset": true, 00:17:20.423 "compare": false, 00:17:20.423 "compare_and_write": false, 00:17:20.423 "abort": true, 00:17:20.423 "nvme_admin": false, 00:17:20.423 "nvme_io": false 00:17:20.423 }, 00:17:20.423 "memory_domains": [ 00:17:20.423 { 00:17:20.423 "dma_device_id": "system", 00:17:20.423 "dma_device_type": 1 00:17:20.423 }, 00:17:20.423 { 00:17:20.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.423 "dma_device_type": 2 00:17:20.423 } 00:17:20.423 ], 00:17:20.423 "driver_specific": {} 00:17:20.423 } 00:17:20.423 ] 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:20.423 10:12:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:20.423 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:20.423 "name": "Existed_Raid", 00:17:20.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.423 "strip_size_kb": 64, 00:17:20.423 "state": "configuring", 00:17:20.423 "raid_level": "concat", 00:17:20.423 "superblock": false, 00:17:20.423 "num_base_bdevs": 4, 00:17:20.423 "num_base_bdevs_discovered": 1, 00:17:20.423 "num_base_bdevs_operational": 4, 00:17:20.423 "base_bdevs_list": [ 00:17:20.423 { 00:17:20.423 "name": "BaseBdev1", 00:17:20.423 "uuid": "dc00fd4a-3661-417f-8447-22ad59ee56a2", 00:17:20.423 "is_configured": true, 00:17:20.423 "data_offset": 0, 00:17:20.423 "data_size": 65536 00:17:20.423 }, 00:17:20.423 { 00:17:20.423 "name": "BaseBdev2", 00:17:20.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.423 "is_configured": false, 00:17:20.423 "data_offset": 0, 00:17:20.423 "data_size": 0 00:17:20.423 }, 00:17:20.423 { 00:17:20.423 "name": "BaseBdev3", 00:17:20.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.423 "is_configured": false, 00:17:20.423 "data_offset": 0, 00:17:20.423 "data_size": 0 00:17:20.423 }, 00:17:20.423 { 00:17:20.423 "name": "BaseBdev4", 00:17:20.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.423 "is_configured": false, 00:17:20.424 "data_offset": 0, 00:17:20.424 "data_size": 0 00:17:20.424 } 00:17:20.424 ] 00:17:20.424 }' 00:17:20.424 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:20.424 10:12:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.994 10:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:21.253 [2024-06-10 10:12:42.987065] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:21.253 [2024-06-10 10:12:42.987091] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c63b0 name 
Existed_Raid, state configuring 00:17:21.253 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:21.513 [2024-06-10 10:12:43.179580] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:21.513 [2024-06-10 10:12:43.180719] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:21.513 [2024-06-10 10:12:43.180743] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:21.513 [2024-06-10 10:12:43.180749] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:21.513 [2024-06-10 10:12:43.180755] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:21.513 [2024-06-10 10:12:43.180766] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:21.513 [2024-06-10 10:12:43.180771] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:21.513 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:21.513 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:21.513 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:21.513 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:21.513 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:21.513 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:21.514 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:21.514 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:21.514 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.514 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:21.514 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.514 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:21.514 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.514 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:21.773 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.773 "name": "Existed_Raid", 00:17:21.773 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.773 "strip_size_kb": 64, 00:17:21.773 "state": "configuring", 00:17:21.773 "raid_level": "concat", 00:17:21.773 "superblock": false, 00:17:21.773 "num_base_bdevs": 4, 00:17:21.773 "num_base_bdevs_discovered": 1, 00:17:21.773 "num_base_bdevs_operational": 4, 00:17:21.773 "base_bdevs_list": [ 00:17:21.773 { 00:17:21.773 "name": "BaseBdev1", 00:17:21.773 "uuid": "dc00fd4a-3661-417f-8447-22ad59ee56a2", 00:17:21.773 
"is_configured": true, 00:17:21.773 "data_offset": 0, 00:17:21.773 "data_size": 65536 00:17:21.773 }, 00:17:21.773 { 00:17:21.773 "name": "BaseBdev2", 00:17:21.773 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.773 "is_configured": false, 00:17:21.773 "data_offset": 0, 00:17:21.773 "data_size": 0 00:17:21.773 }, 00:17:21.773 { 00:17:21.773 "name": "BaseBdev3", 00:17:21.773 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.773 "is_configured": false, 00:17:21.773 "data_offset": 0, 00:17:21.773 "data_size": 0 00:17:21.773 }, 00:17:21.773 { 00:17:21.773 "name": "BaseBdev4", 00:17:21.773 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.773 "is_configured": false, 00:17:21.773 "data_offset": 0, 00:17:21.773 "data_size": 0 00:17:21.773 } 00:17:21.773 ] 00:17:21.773 }' 00:17:21.773 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.773 10:12:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.343 10:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:22.343 [2024-06-10 10:12:44.094902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:22.343 BaseBdev2 00:17:22.343 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:22.343 10:12:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:17:22.343 10:12:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:22.343 10:12:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:22.343 10:12:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:22.343 10:12:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:22.343 10:12:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:22.603 10:12:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:22.603 [ 00:17:22.603 { 00:17:22.603 "name": "BaseBdev2", 00:17:22.603 "aliases": [ 00:17:22.603 "adb4ff25-f5c6-43f4-a392-a28d932dd00c" 00:17:22.603 ], 00:17:22.603 "product_name": "Malloc disk", 00:17:22.603 "block_size": 512, 00:17:22.603 "num_blocks": 65536, 00:17:22.603 "uuid": "adb4ff25-f5c6-43f4-a392-a28d932dd00c", 00:17:22.603 "assigned_rate_limits": { 00:17:22.603 "rw_ios_per_sec": 0, 00:17:22.603 "rw_mbytes_per_sec": 0, 00:17:22.603 "r_mbytes_per_sec": 0, 00:17:22.603 "w_mbytes_per_sec": 0 00:17:22.603 }, 00:17:22.603 "claimed": true, 00:17:22.603 "claim_type": "exclusive_write", 00:17:22.603 "zoned": false, 00:17:22.603 "supported_io_types": { 00:17:22.603 "read": true, 00:17:22.603 "write": true, 00:17:22.603 "unmap": true, 00:17:22.603 "write_zeroes": true, 00:17:22.603 "flush": true, 00:17:22.603 "reset": true, 00:17:22.603 "compare": false, 00:17:22.603 "compare_and_write": false, 00:17:22.603 "abort": true, 00:17:22.603 "nvme_admin": false, 00:17:22.603 "nvme_io": false 00:17:22.603 }, 00:17:22.603 "memory_domains": [ 00:17:22.603 { 00:17:22.603 
"dma_device_id": "system", 00:17:22.603 "dma_device_type": 1 00:17:22.603 }, 00:17:22.603 { 00:17:22.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.603 "dma_device_type": 2 00:17:22.603 } 00:17:22.603 ], 00:17:22.603 "driver_specific": {} 00:17:22.603 } 00:17:22.603 ] 00:17:22.863 10:12:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:22.863 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:22.863 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.864 "name": "Existed_Raid", 00:17:22.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.864 "strip_size_kb": 64, 00:17:22.864 "state": "configuring", 00:17:22.864 "raid_level": "concat", 00:17:22.864 "superblock": false, 00:17:22.864 "num_base_bdevs": 4, 00:17:22.864 "num_base_bdevs_discovered": 2, 00:17:22.864 "num_base_bdevs_operational": 4, 00:17:22.864 "base_bdevs_list": [ 00:17:22.864 { 00:17:22.864 "name": "BaseBdev1", 00:17:22.864 "uuid": "dc00fd4a-3661-417f-8447-22ad59ee56a2", 00:17:22.864 "is_configured": true, 00:17:22.864 "data_offset": 0, 00:17:22.864 "data_size": 65536 00:17:22.864 }, 00:17:22.864 { 00:17:22.864 "name": "BaseBdev2", 00:17:22.864 "uuid": "adb4ff25-f5c6-43f4-a392-a28d932dd00c", 00:17:22.864 "is_configured": true, 00:17:22.864 "data_offset": 0, 00:17:22.864 "data_size": 65536 00:17:22.864 }, 00:17:22.864 { 00:17:22.864 "name": "BaseBdev3", 00:17:22.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.864 "is_configured": false, 00:17:22.864 "data_offset": 0, 00:17:22.864 "data_size": 0 00:17:22.864 }, 00:17:22.864 { 00:17:22.864 "name": "BaseBdev4", 00:17:22.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.864 "is_configured": false, 00:17:22.864 "data_offset": 0, 00:17:22.864 "data_size": 0 00:17:22.864 } 00:17:22.864 ] 00:17:22.864 }' 00:17:22.864 10:12:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.864 10:12:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:23.435 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:23.695 [2024-06-10 10:12:45.402982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:23.695 BaseBdev3 00:17:23.695 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:23.695 10:12:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:17:23.695 10:12:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:23.695 10:12:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:23.695 10:12:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:23.695 10:12:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:23.695 10:12:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:23.955 10:12:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:23.955 [ 00:17:23.955 { 00:17:23.955 "name": "BaseBdev3", 00:17:23.955 "aliases": [ 00:17:23.955 "fd4a0433-84b0-47c2-a29a-90479861f3ff" 00:17:23.955 ], 00:17:23.955 "product_name": "Malloc disk", 00:17:23.955 "block_size": 512, 00:17:23.955 "num_blocks": 65536, 00:17:23.955 "uuid": "fd4a0433-84b0-47c2-a29a-90479861f3ff", 00:17:23.955 "assigned_rate_limits": { 00:17:23.955 "rw_ios_per_sec": 0, 00:17:23.955 "rw_mbytes_per_sec": 0, 00:17:23.955 "r_mbytes_per_sec": 0, 00:17:23.955 "w_mbytes_per_sec": 0 00:17:23.955 }, 00:17:23.955 "claimed": true, 00:17:23.955 "claim_type": "exclusive_write", 00:17:23.955 "zoned": false, 00:17:23.955 "supported_io_types": { 00:17:23.955 "read": true, 00:17:23.955 "write": true, 00:17:23.955 "unmap": true, 00:17:23.955 "write_zeroes": true, 00:17:23.955 "flush": true, 00:17:23.955 "reset": true, 00:17:23.955 "compare": false, 00:17:23.955 "compare_and_write": false, 00:17:23.955 "abort": true, 00:17:23.955 "nvme_admin": false, 00:17:23.955 "nvme_io": false 00:17:23.955 }, 00:17:23.955 "memory_domains": [ 00:17:23.955 { 00:17:23.955 "dma_device_id": "system", 00:17:23.955 "dma_device_type": 1 00:17:23.955 }, 00:17:23.955 { 00:17:23.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.955 "dma_device_type": 2 00:17:23.955 } 00:17:23.955 ], 00:17:23.955 "driver_specific": {} 00:17:23.955 } 00:17:23.955 ] 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.956 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:24.215 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:24.215 "name": "Existed_Raid", 00:17:24.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.215 "strip_size_kb": 64, 00:17:24.215 "state": "configuring", 00:17:24.215 "raid_level": "concat", 00:17:24.215 "superblock": false, 00:17:24.215 "num_base_bdevs": 4, 00:17:24.215 "num_base_bdevs_discovered": 3, 00:17:24.215 "num_base_bdevs_operational": 4, 00:17:24.215 "base_bdevs_list": [ 00:17:24.215 { 00:17:24.215 "name": "BaseBdev1", 00:17:24.215 "uuid": "dc00fd4a-3661-417f-8447-22ad59ee56a2", 00:17:24.215 "is_configured": true, 00:17:24.215 "data_offset": 0, 00:17:24.215 "data_size": 65536 00:17:24.215 }, 00:17:24.215 { 00:17:24.215 "name": "BaseBdev2", 00:17:24.215 "uuid": "adb4ff25-f5c6-43f4-a392-a28d932dd00c", 00:17:24.215 "is_configured": true, 00:17:24.215 "data_offset": 0, 00:17:24.215 "data_size": 65536 00:17:24.215 }, 00:17:24.215 { 00:17:24.215 "name": "BaseBdev3", 00:17:24.215 "uuid": "fd4a0433-84b0-47c2-a29a-90479861f3ff", 00:17:24.215 "is_configured": true, 00:17:24.215 "data_offset": 0, 00:17:24.215 "data_size": 65536 00:17:24.215 }, 00:17:24.215 { 00:17:24.215 "name": "BaseBdev4", 00:17:24.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.215 "is_configured": false, 00:17:24.215 "data_offset": 0, 00:17:24.215 "data_size": 0 00:17:24.215 } 00:17:24.215 ] 00:17:24.215 }' 00:17:24.215 10:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:24.215 10:12:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.784 10:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:25.044 [2024-06-10 10:12:46.683012] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:25.044 [2024-06-10 10:12:46.683036] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25c74c0 00:17:25.044 [2024-06-10 10:12:46.683040] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:25.044 [2024-06-10 10:12:46.683182] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25bf8a0 00:17:25.044 [2024-06-10 10:12:46.683274] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25c74c0 00:17:25.044 [2024-06-10 10:12:46.683279] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25c74c0 00:17:25.044 [2024-06-10 10:12:46.683395] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:25.044 BaseBdev4 00:17:25.044 10:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:25.044 10:12:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:17:25.044 10:12:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:25.044 10:12:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:25.044 10:12:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:25.044 10:12:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:25.044 10:12:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:25.045 10:12:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:25.304 [ 00:17:25.304 { 00:17:25.304 "name": "BaseBdev4", 00:17:25.304 "aliases": [ 00:17:25.304 "e865595c-ba50-4925-99d2-55f987d607de" 00:17:25.304 ], 00:17:25.304 "product_name": "Malloc disk", 00:17:25.304 "block_size": 512, 00:17:25.304 "num_blocks": 65536, 00:17:25.304 "uuid": "e865595c-ba50-4925-99d2-55f987d607de", 00:17:25.304 "assigned_rate_limits": { 00:17:25.304 "rw_ios_per_sec": 0, 00:17:25.304 "rw_mbytes_per_sec": 0, 00:17:25.304 "r_mbytes_per_sec": 0, 00:17:25.304 "w_mbytes_per_sec": 0 00:17:25.304 }, 00:17:25.304 "claimed": true, 00:17:25.304 "claim_type": "exclusive_write", 00:17:25.304 "zoned": false, 00:17:25.304 "supported_io_types": { 00:17:25.304 "read": true, 00:17:25.304 "write": true, 00:17:25.304 "unmap": true, 00:17:25.304 "write_zeroes": true, 00:17:25.304 "flush": true, 00:17:25.304 "reset": true, 00:17:25.304 "compare": false, 00:17:25.304 "compare_and_write": false, 00:17:25.304 "abort": true, 00:17:25.304 "nvme_admin": false, 00:17:25.304 "nvme_io": false 00:17:25.304 }, 00:17:25.304 "memory_domains": [ 00:17:25.304 { 00:17:25.304 "dma_device_id": "system", 00:17:25.304 "dma_device_type": 1 00:17:25.304 }, 00:17:25.304 { 00:17:25.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.304 "dma_device_type": 2 00:17:25.304 } 00:17:25.304 ], 00:17:25.304 "driver_specific": {} 00:17:25.304 } 00:17:25.304 ] 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:25.304 10:12:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.304 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:25.564 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:25.564 "name": "Existed_Raid", 00:17:25.564 "uuid": "9e0785a9-dfd9-47a5-a3b0-f1689735315b", 00:17:25.564 "strip_size_kb": 64, 00:17:25.564 "state": "online", 00:17:25.564 "raid_level": "concat", 00:17:25.564 "superblock": false, 00:17:25.564 "num_base_bdevs": 4, 00:17:25.564 "num_base_bdevs_discovered": 4, 00:17:25.564 "num_base_bdevs_operational": 4, 00:17:25.564 "base_bdevs_list": [ 00:17:25.564 { 00:17:25.564 "name": "BaseBdev1", 00:17:25.564 "uuid": "dc00fd4a-3661-417f-8447-22ad59ee56a2", 00:17:25.564 "is_configured": true, 00:17:25.564 "data_offset": 0, 00:17:25.564 "data_size": 65536 00:17:25.564 }, 00:17:25.564 { 00:17:25.564 "name": "BaseBdev2", 00:17:25.564 "uuid": "adb4ff25-f5c6-43f4-a392-a28d932dd00c", 00:17:25.564 "is_configured": true, 00:17:25.564 "data_offset": 0, 00:17:25.564 "data_size": 65536 00:17:25.564 }, 00:17:25.564 { 00:17:25.564 "name": "BaseBdev3", 00:17:25.564 "uuid": "fd4a0433-84b0-47c2-a29a-90479861f3ff", 00:17:25.564 "is_configured": true, 00:17:25.564 "data_offset": 0, 00:17:25.564 "data_size": 65536 00:17:25.564 }, 00:17:25.564 { 00:17:25.564 "name": "BaseBdev4", 00:17:25.564 "uuid": "e865595c-ba50-4925-99d2-55f987d607de", 00:17:25.564 "is_configured": true, 00:17:25.564 "data_offset": 0, 00:17:25.564 "data_size": 65536 00:17:25.564 } 00:17:25.564 ] 00:17:25.564 }' 00:17:25.564 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:25.564 10:12:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:26.134 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:26.134 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:26.134 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:26.134 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:26.134 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:26.134 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:26.134 10:12:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:26.134 10:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:26.134 [2024-06-10 10:12:47.986558] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:26.394 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:26.394 "name": "Existed_Raid", 00:17:26.394 "aliases": [ 00:17:26.394 "9e0785a9-dfd9-47a5-a3b0-f1689735315b" 00:17:26.394 ], 00:17:26.394 "product_name": "Raid Volume", 00:17:26.394 "block_size": 512, 00:17:26.394 "num_blocks": 262144, 00:17:26.394 "uuid": "9e0785a9-dfd9-47a5-a3b0-f1689735315b", 00:17:26.394 "assigned_rate_limits": { 00:17:26.394 "rw_ios_per_sec": 0, 00:17:26.394 "rw_mbytes_per_sec": 0, 00:17:26.394 "r_mbytes_per_sec": 0, 00:17:26.394 "w_mbytes_per_sec": 0 00:17:26.394 }, 00:17:26.394 "claimed": false, 00:17:26.394 "zoned": false, 00:17:26.394 "supported_io_types": { 00:17:26.394 "read": true, 00:17:26.394 "write": true, 00:17:26.394 "unmap": true, 00:17:26.394 "write_zeroes": true, 00:17:26.394 "flush": true, 00:17:26.394 "reset": true, 00:17:26.394 "compare": false, 00:17:26.394 "compare_and_write": false, 00:17:26.394 "abort": false, 00:17:26.394 "nvme_admin": false, 00:17:26.394 "nvme_io": false 00:17:26.394 }, 00:17:26.394 "memory_domains": [ 00:17:26.394 { 00:17:26.394 "dma_device_id": "system", 00:17:26.394 "dma_device_type": 1 00:17:26.394 }, 00:17:26.394 { 00:17:26.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.394 "dma_device_type": 2 00:17:26.394 }, 00:17:26.394 { 00:17:26.394 "dma_device_id": "system", 00:17:26.394 "dma_device_type": 1 00:17:26.394 }, 00:17:26.394 { 00:17:26.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.394 "dma_device_type": 2 00:17:26.394 }, 00:17:26.394 { 00:17:26.394 "dma_device_id": "system", 00:17:26.394 "dma_device_type": 1 00:17:26.394 }, 00:17:26.394 { 00:17:26.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.394 "dma_device_type": 2 00:17:26.394 }, 00:17:26.394 { 00:17:26.394 "dma_device_id": "system", 00:17:26.394 "dma_device_type": 1 00:17:26.394 }, 00:17:26.394 { 00:17:26.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.394 "dma_device_type": 2 00:17:26.394 } 00:17:26.394 ], 00:17:26.394 "driver_specific": { 00:17:26.394 "raid": { 00:17:26.394 "uuid": "9e0785a9-dfd9-47a5-a3b0-f1689735315b", 00:17:26.394 "strip_size_kb": 64, 00:17:26.394 "state": "online", 00:17:26.394 "raid_level": "concat", 00:17:26.394 "superblock": false, 00:17:26.394 "num_base_bdevs": 4, 00:17:26.394 "num_base_bdevs_discovered": 4, 00:17:26.394 "num_base_bdevs_operational": 4, 00:17:26.394 "base_bdevs_list": [ 00:17:26.394 { 00:17:26.394 "name": "BaseBdev1", 00:17:26.394 "uuid": "dc00fd4a-3661-417f-8447-22ad59ee56a2", 00:17:26.394 "is_configured": true, 00:17:26.394 "data_offset": 0, 00:17:26.394 "data_size": 65536 00:17:26.394 }, 00:17:26.394 { 00:17:26.394 "name": "BaseBdev2", 00:17:26.394 "uuid": "adb4ff25-f5c6-43f4-a392-a28d932dd00c", 00:17:26.394 "is_configured": true, 00:17:26.394 "data_offset": 0, 00:17:26.394 "data_size": 65536 00:17:26.394 }, 00:17:26.394 { 00:17:26.394 "name": "BaseBdev3", 00:17:26.394 "uuid": "fd4a0433-84b0-47c2-a29a-90479861f3ff", 00:17:26.394 "is_configured": true, 00:17:26.394 "data_offset": 0, 00:17:26.394 "data_size": 65536 00:17:26.394 }, 00:17:26.394 { 00:17:26.394 "name": "BaseBdev4", 
00:17:26.394 "uuid": "e865595c-ba50-4925-99d2-55f987d607de", 00:17:26.394 "is_configured": true, 00:17:26.394 "data_offset": 0, 00:17:26.394 "data_size": 65536 00:17:26.394 } 00:17:26.394 ] 00:17:26.394 } 00:17:26.394 } 00:17:26.394 }' 00:17:26.394 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:26.394 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:26.394 BaseBdev2 00:17:26.394 BaseBdev3 00:17:26.394 BaseBdev4' 00:17:26.394 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:26.394 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:26.394 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:26.394 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:26.394 "name": "BaseBdev1", 00:17:26.394 "aliases": [ 00:17:26.394 "dc00fd4a-3661-417f-8447-22ad59ee56a2" 00:17:26.394 ], 00:17:26.394 "product_name": "Malloc disk", 00:17:26.394 "block_size": 512, 00:17:26.394 "num_blocks": 65536, 00:17:26.394 "uuid": "dc00fd4a-3661-417f-8447-22ad59ee56a2", 00:17:26.394 "assigned_rate_limits": { 00:17:26.394 "rw_ios_per_sec": 0, 00:17:26.394 "rw_mbytes_per_sec": 0, 00:17:26.394 "r_mbytes_per_sec": 0, 00:17:26.394 "w_mbytes_per_sec": 0 00:17:26.394 }, 00:17:26.394 "claimed": true, 00:17:26.394 "claim_type": "exclusive_write", 00:17:26.394 "zoned": false, 00:17:26.394 "supported_io_types": { 00:17:26.394 "read": true, 00:17:26.394 "write": true, 00:17:26.394 "unmap": true, 00:17:26.394 "write_zeroes": true, 00:17:26.394 "flush": true, 00:17:26.394 "reset": true, 00:17:26.394 "compare": false, 00:17:26.394 "compare_and_write": false, 00:17:26.394 "abort": true, 00:17:26.394 "nvme_admin": false, 00:17:26.394 "nvme_io": false 00:17:26.394 }, 00:17:26.394 "memory_domains": [ 00:17:26.394 { 00:17:26.394 "dma_device_id": "system", 00:17:26.394 "dma_device_type": 1 00:17:26.394 }, 00:17:26.394 { 00:17:26.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.394 "dma_device_type": 2 00:17:26.394 } 00:17:26.394 ], 00:17:26.394 "driver_specific": {} 00:17:26.394 }' 00:17:26.394 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:26.654 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:26.654 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:26.654 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.654 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.654 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:26.654 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.654 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.654 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:26.654 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.914 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:17:26.914 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:26.914 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:26.914 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:26.914 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:27.177 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:27.177 "name": "BaseBdev2", 00:17:27.177 "aliases": [ 00:17:27.177 "adb4ff25-f5c6-43f4-a392-a28d932dd00c" 00:17:27.177 ], 00:17:27.177 "product_name": "Malloc disk", 00:17:27.177 "block_size": 512, 00:17:27.177 "num_blocks": 65536, 00:17:27.177 "uuid": "adb4ff25-f5c6-43f4-a392-a28d932dd00c", 00:17:27.177 "assigned_rate_limits": { 00:17:27.177 "rw_ios_per_sec": 0, 00:17:27.177 "rw_mbytes_per_sec": 0, 00:17:27.177 "r_mbytes_per_sec": 0, 00:17:27.177 "w_mbytes_per_sec": 0 00:17:27.177 }, 00:17:27.177 "claimed": true, 00:17:27.177 "claim_type": "exclusive_write", 00:17:27.177 "zoned": false, 00:17:27.177 "supported_io_types": { 00:17:27.177 "read": true, 00:17:27.177 "write": true, 00:17:27.177 "unmap": true, 00:17:27.177 "write_zeroes": true, 00:17:27.177 "flush": true, 00:17:27.177 "reset": true, 00:17:27.177 "compare": false, 00:17:27.177 "compare_and_write": false, 00:17:27.177 "abort": true, 00:17:27.177 "nvme_admin": false, 00:17:27.177 "nvme_io": false 00:17:27.177 }, 00:17:27.177 "memory_domains": [ 00:17:27.177 { 00:17:27.177 "dma_device_id": "system", 00:17:27.177 "dma_device_type": 1 00:17:27.177 }, 00:17:27.177 { 00:17:27.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.177 "dma_device_type": 2 00:17:27.177 } 00:17:27.177 ], 00:17:27.177 "driver_specific": {} 00:17:27.177 }' 00:17:27.177 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:27.177 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:27.177 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:27.177 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:27.177 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:27.177 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:27.177 10:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:27.177 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:27.503 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:27.503 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:27.503 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:27.503 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:27.503 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:27.503 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:27.503 10:12:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:27.503 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:27.503 "name": "BaseBdev3", 00:17:27.503 "aliases": [ 00:17:27.503 "fd4a0433-84b0-47c2-a29a-90479861f3ff" 00:17:27.503 ], 00:17:27.503 "product_name": "Malloc disk", 00:17:27.503 "block_size": 512, 00:17:27.503 "num_blocks": 65536, 00:17:27.503 "uuid": "fd4a0433-84b0-47c2-a29a-90479861f3ff", 00:17:27.503 "assigned_rate_limits": { 00:17:27.503 "rw_ios_per_sec": 0, 00:17:27.503 "rw_mbytes_per_sec": 0, 00:17:27.503 "r_mbytes_per_sec": 0, 00:17:27.503 "w_mbytes_per_sec": 0 00:17:27.503 }, 00:17:27.503 "claimed": true, 00:17:27.503 "claim_type": "exclusive_write", 00:17:27.503 "zoned": false, 00:17:27.503 "supported_io_types": { 00:17:27.503 "read": true, 00:17:27.503 "write": true, 00:17:27.503 "unmap": true, 00:17:27.503 "write_zeroes": true, 00:17:27.503 "flush": true, 00:17:27.503 "reset": true, 00:17:27.503 "compare": false, 00:17:27.503 "compare_and_write": false, 00:17:27.503 "abort": true, 00:17:27.503 "nvme_admin": false, 00:17:27.503 "nvme_io": false 00:17:27.503 }, 00:17:27.503 "memory_domains": [ 00:17:27.503 { 00:17:27.503 "dma_device_id": "system", 00:17:27.503 "dma_device_type": 1 00:17:27.503 }, 00:17:27.503 { 00:17:27.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.503 "dma_device_type": 2 00:17:27.503 } 00:17:27.503 ], 00:17:27.503 "driver_specific": {} 00:17:27.503 }' 00:17:27.503 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:27.788 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:27.788 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:27.788 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:27.788 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:27.788 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:27.788 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:27.788 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:27.788 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:27.788 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.048 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.048 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:28.048 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:28.048 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:28.048 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:28.048 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:28.048 "name": "BaseBdev4", 00:17:28.048 "aliases": [ 00:17:28.048 "e865595c-ba50-4925-99d2-55f987d607de" 00:17:28.048 ], 00:17:28.048 "product_name": "Malloc disk", 00:17:28.048 "block_size": 512, 00:17:28.048 "num_blocks": 65536, 00:17:28.048 "uuid": "e865595c-ba50-4925-99d2-55f987d607de", 
00:17:28.048 "assigned_rate_limits": { 00:17:28.048 "rw_ios_per_sec": 0, 00:17:28.048 "rw_mbytes_per_sec": 0, 00:17:28.048 "r_mbytes_per_sec": 0, 00:17:28.049 "w_mbytes_per_sec": 0 00:17:28.049 }, 00:17:28.049 "claimed": true, 00:17:28.049 "claim_type": "exclusive_write", 00:17:28.049 "zoned": false, 00:17:28.049 "supported_io_types": { 00:17:28.049 "read": true, 00:17:28.049 "write": true, 00:17:28.049 "unmap": true, 00:17:28.049 "write_zeroes": true, 00:17:28.049 "flush": true, 00:17:28.049 "reset": true, 00:17:28.049 "compare": false, 00:17:28.049 "compare_and_write": false, 00:17:28.049 "abort": true, 00:17:28.049 "nvme_admin": false, 00:17:28.049 "nvme_io": false 00:17:28.049 }, 00:17:28.049 "memory_domains": [ 00:17:28.049 { 00:17:28.049 "dma_device_id": "system", 00:17:28.049 "dma_device_type": 1 00:17:28.049 }, 00:17:28.049 { 00:17:28.049 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.049 "dma_device_type": 2 00:17:28.049 } 00:17:28.049 ], 00:17:28.049 "driver_specific": {} 00:17:28.049 }' 00:17:28.049 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:28.308 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:28.308 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:28.308 10:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:28.309 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:28.309 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:28.309 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:28.309 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:28.309 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:28.309 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:28.569 [2024-06-10 10:12:50.384436] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:28.569 [2024-06-10 10:12:50.384453] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:28.569 [2024-06-10 10:12:50.384490] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.569 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.829 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.829 "name": "Existed_Raid", 00:17:28.829 "uuid": "9e0785a9-dfd9-47a5-a3b0-f1689735315b", 00:17:28.829 "strip_size_kb": 64, 00:17:28.829 "state": "offline", 00:17:28.829 "raid_level": "concat", 00:17:28.829 "superblock": false, 00:17:28.829 "num_base_bdevs": 4, 00:17:28.829 "num_base_bdevs_discovered": 3, 00:17:28.829 "num_base_bdevs_operational": 3, 00:17:28.829 "base_bdevs_list": [ 00:17:28.829 { 00:17:28.829 "name": null, 00:17:28.829 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.829 "is_configured": false, 00:17:28.829 "data_offset": 0, 00:17:28.829 "data_size": 65536 00:17:28.829 }, 00:17:28.829 { 00:17:28.829 "name": "BaseBdev2", 00:17:28.829 "uuid": "adb4ff25-f5c6-43f4-a392-a28d932dd00c", 00:17:28.829 "is_configured": true, 00:17:28.829 "data_offset": 0, 00:17:28.829 "data_size": 65536 00:17:28.829 }, 00:17:28.829 { 00:17:28.830 "name": "BaseBdev3", 00:17:28.830 "uuid": "fd4a0433-84b0-47c2-a29a-90479861f3ff", 00:17:28.830 "is_configured": true, 00:17:28.830 "data_offset": 0, 00:17:28.830 "data_size": 65536 00:17:28.830 }, 00:17:28.830 { 00:17:28.830 "name": "BaseBdev4", 00:17:28.830 "uuid": "e865595c-ba50-4925-99d2-55f987d607de", 00:17:28.830 "is_configured": true, 00:17:28.830 "data_offset": 0, 00:17:28.830 "data_size": 65536 00:17:28.830 } 00:17:28.830 ] 00:17:28.830 }' 00:17:28.830 10:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.830 10:12:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.400 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:29.400 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:29.400 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.400 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:29.659 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:29.659 
10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:29.659 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:29.659 [2024-06-10 10:12:51.503266] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:29.659 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:29.659 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:29.659 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:29.659 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.920 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:29.920 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:29.920 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:30.181 [2024-06-10 10:12:51.890029] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:30.181 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:30.181 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:30.181 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.181 10:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:30.512 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:30.512 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:30.512 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:30.512 [2024-06-10 10:12:52.276739] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:30.512 [2024-06-10 10:12:52.276766] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c74c0 name Existed_Raid, state offline 00:17:30.512 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:30.512 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:30.512 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.512 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:30.772 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:30.772 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:30.772 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:30.772 
10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:30.772 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:30.772 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:31.033 BaseBdev2 00:17:31.033 10:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:31.033 10:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:17:31.033 10:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:31.033 10:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:31.033 10:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:31.033 10:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:31.033 10:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:31.033 10:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:31.293 [ 00:17:31.293 { 00:17:31.293 "name": "BaseBdev2", 00:17:31.293 "aliases": [ 00:17:31.293 "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2" 00:17:31.293 ], 00:17:31.293 "product_name": "Malloc disk", 00:17:31.293 "block_size": 512, 00:17:31.293 "num_blocks": 65536, 00:17:31.293 "uuid": "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2", 00:17:31.293 "assigned_rate_limits": { 00:17:31.293 "rw_ios_per_sec": 0, 00:17:31.293 "rw_mbytes_per_sec": 0, 00:17:31.293 "r_mbytes_per_sec": 0, 00:17:31.293 "w_mbytes_per_sec": 0 00:17:31.293 }, 00:17:31.293 "claimed": false, 00:17:31.293 "zoned": false, 00:17:31.293 "supported_io_types": { 00:17:31.293 "read": true, 00:17:31.293 "write": true, 00:17:31.293 "unmap": true, 00:17:31.293 "write_zeroes": true, 00:17:31.293 "flush": true, 00:17:31.293 "reset": true, 00:17:31.293 "compare": false, 00:17:31.293 "compare_and_write": false, 00:17:31.293 "abort": true, 00:17:31.293 "nvme_admin": false, 00:17:31.293 "nvme_io": false 00:17:31.293 }, 00:17:31.293 "memory_domains": [ 00:17:31.293 { 00:17:31.293 "dma_device_id": "system", 00:17:31.293 "dma_device_type": 1 00:17:31.293 }, 00:17:31.293 { 00:17:31.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.293 "dma_device_type": 2 00:17:31.293 } 00:17:31.293 ], 00:17:31.293 "driver_specific": {} 00:17:31.293 } 00:17:31.293 ] 00:17:31.293 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:31.293 10:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:31.293 10:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:31.293 10:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:31.552 BaseBdev3 00:17:31.552 10:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:31.552 10:12:53 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:17:31.552 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:31.552 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:31.552 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:31.552 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:31.552 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:31.812 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:31.812 [ 00:17:31.812 { 00:17:31.812 "name": "BaseBdev3", 00:17:31.812 "aliases": [ 00:17:31.812 "6096aea9-aee6-44ff-84f4-837beecded21" 00:17:31.812 ], 00:17:31.812 "product_name": "Malloc disk", 00:17:31.812 "block_size": 512, 00:17:31.812 "num_blocks": 65536, 00:17:31.812 "uuid": "6096aea9-aee6-44ff-84f4-837beecded21", 00:17:31.812 "assigned_rate_limits": { 00:17:31.812 "rw_ios_per_sec": 0, 00:17:31.812 "rw_mbytes_per_sec": 0, 00:17:31.812 "r_mbytes_per_sec": 0, 00:17:31.812 "w_mbytes_per_sec": 0 00:17:31.812 }, 00:17:31.812 "claimed": false, 00:17:31.812 "zoned": false, 00:17:31.812 "supported_io_types": { 00:17:31.812 "read": true, 00:17:31.812 "write": true, 00:17:31.812 "unmap": true, 00:17:31.812 "write_zeroes": true, 00:17:31.812 "flush": true, 00:17:31.812 "reset": true, 00:17:31.812 "compare": false, 00:17:31.812 "compare_and_write": false, 00:17:31.812 "abort": true, 00:17:31.812 "nvme_admin": false, 00:17:31.812 "nvme_io": false 00:17:31.812 }, 00:17:31.812 "memory_domains": [ 00:17:31.812 { 00:17:31.812 "dma_device_id": "system", 00:17:31.812 "dma_device_type": 1 00:17:31.812 }, 00:17:31.812 { 00:17:31.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.812 "dma_device_type": 2 00:17:31.812 } 00:17:31.812 ], 00:17:31.812 "driver_specific": {} 00:17:31.812 } 00:17:31.812 ] 00:17:31.812 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:31.812 10:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:31.812 10:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:31.812 10:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:32.072 BaseBdev4 00:17:32.072 10:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:32.072 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:17:32.072 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:32.072 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:32.072 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:32.072 10:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:32.072 10:12:53 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:32.332 10:12:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:32.591 [ 00:17:32.591 { 00:17:32.591 "name": "BaseBdev4", 00:17:32.591 "aliases": [ 00:17:32.591 "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d" 00:17:32.591 ], 00:17:32.591 "product_name": "Malloc disk", 00:17:32.591 "block_size": 512, 00:17:32.591 "num_blocks": 65536, 00:17:32.591 "uuid": "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d", 00:17:32.591 "assigned_rate_limits": { 00:17:32.591 "rw_ios_per_sec": 0, 00:17:32.591 "rw_mbytes_per_sec": 0, 00:17:32.591 "r_mbytes_per_sec": 0, 00:17:32.591 "w_mbytes_per_sec": 0 00:17:32.591 }, 00:17:32.591 "claimed": false, 00:17:32.591 "zoned": false, 00:17:32.591 "supported_io_types": { 00:17:32.591 "read": true, 00:17:32.591 "write": true, 00:17:32.591 "unmap": true, 00:17:32.591 "write_zeroes": true, 00:17:32.591 "flush": true, 00:17:32.591 "reset": true, 00:17:32.591 "compare": false, 00:17:32.591 "compare_and_write": false, 00:17:32.591 "abort": true, 00:17:32.591 "nvme_admin": false, 00:17:32.591 "nvme_io": false 00:17:32.591 }, 00:17:32.591 "memory_domains": [ 00:17:32.591 { 00:17:32.591 "dma_device_id": "system", 00:17:32.591 "dma_device_type": 1 00:17:32.591 }, 00:17:32.591 { 00:17:32.591 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.591 "dma_device_type": 2 00:17:32.591 } 00:17:32.591 ], 00:17:32.591 "driver_specific": {} 00:17:32.591 } 00:17:32.591 ] 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:32.591 [2024-06-10 10:12:54.387731] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:32.591 [2024-06-10 10:12:54.387759] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:32.591 [2024-06-10 10:12:54.387771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:32.591 [2024-06-10 10:12:54.388791] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:32.591 [2024-06-10 10:12:54.388828] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
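The trace above follows one fixed pattern: each base device is a malloc bdev created over the test's dedicated RPC socket, the concat raid is assembled from those names, and the resulting state is read back with bdev_raid_get_bdevs and filtered through jq. A minimal sketch of that same sequence, assuming an SPDK target is already listening on /var/tmp/spdk-raid.sock (the socket used throughout this run) and using scripts/rpc.py relative to the SPDK checkout:

    # create one base device with the same geometry used for every BaseBdev in this run
    # (32 MiB malloc bdev, 512-byte blocks)
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
    # assemble a concat raid with a 64 KiB strip size from four named base bdevs
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
    # read the raid state back and pick out the Existed_Raid entry
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid") | .state'

With BaseBdev1 still missing, the state reported here stays "configuring"; the raid only transitions to "online" once every slot in base_bdevs_list is configured, which is exactly what verify_raid_bdev_state asserts at each step of the trace.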
00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.591 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:32.851 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:32.851 "name": "Existed_Raid", 00:17:32.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.851 "strip_size_kb": 64, 00:17:32.851 "state": "configuring", 00:17:32.851 "raid_level": "concat", 00:17:32.851 "superblock": false, 00:17:32.851 "num_base_bdevs": 4, 00:17:32.851 "num_base_bdevs_discovered": 3, 00:17:32.851 "num_base_bdevs_operational": 4, 00:17:32.851 "base_bdevs_list": [ 00:17:32.851 { 00:17:32.851 "name": "BaseBdev1", 00:17:32.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.851 "is_configured": false, 00:17:32.851 "data_offset": 0, 00:17:32.851 "data_size": 0 00:17:32.851 }, 00:17:32.851 { 00:17:32.851 "name": "BaseBdev2", 00:17:32.851 "uuid": "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2", 00:17:32.851 "is_configured": true, 00:17:32.851 "data_offset": 0, 00:17:32.851 "data_size": 65536 00:17:32.851 }, 00:17:32.851 { 00:17:32.851 "name": "BaseBdev3", 00:17:32.851 "uuid": "6096aea9-aee6-44ff-84f4-837beecded21", 00:17:32.851 "is_configured": true, 00:17:32.851 "data_offset": 0, 00:17:32.851 "data_size": 65536 00:17:32.851 }, 00:17:32.851 { 00:17:32.851 "name": "BaseBdev4", 00:17:32.851 "uuid": "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d", 00:17:32.851 "is_configured": true, 00:17:32.851 "data_offset": 0, 00:17:32.851 "data_size": 65536 00:17:32.851 } 00:17:32.851 ] 00:17:32.851 }' 00:17:32.851 10:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:32.851 10:12:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.422 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:33.680 [2024-06-10 10:12:55.298006] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.680 "name": "Existed_Raid", 00:17:33.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.680 "strip_size_kb": 64, 00:17:33.680 "state": "configuring", 00:17:33.680 "raid_level": "concat", 00:17:33.680 "superblock": false, 00:17:33.680 "num_base_bdevs": 4, 00:17:33.680 "num_base_bdevs_discovered": 2, 00:17:33.680 "num_base_bdevs_operational": 4, 00:17:33.680 "base_bdevs_list": [ 00:17:33.680 { 00:17:33.680 "name": "BaseBdev1", 00:17:33.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.680 "is_configured": false, 00:17:33.680 "data_offset": 0, 00:17:33.680 "data_size": 0 00:17:33.680 }, 00:17:33.680 { 00:17:33.680 "name": null, 00:17:33.680 "uuid": "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2", 00:17:33.680 "is_configured": false, 00:17:33.680 "data_offset": 0, 00:17:33.680 "data_size": 65536 00:17:33.680 }, 00:17:33.680 { 00:17:33.680 "name": "BaseBdev3", 00:17:33.680 "uuid": "6096aea9-aee6-44ff-84f4-837beecded21", 00:17:33.680 "is_configured": true, 00:17:33.680 "data_offset": 0, 00:17:33.680 "data_size": 65536 00:17:33.680 }, 00:17:33.680 { 00:17:33.680 "name": "BaseBdev4", 00:17:33.680 "uuid": "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d", 00:17:33.680 "is_configured": true, 00:17:33.680 "data_offset": 0, 00:17:33.680 "data_size": 65536 00:17:33.680 } 00:17:33.680 ] 00:17:33.680 }' 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.680 10:12:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.249 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.249 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:34.508 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:34.508 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:34.767 [2024-06-10 10:12:56.429815] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:34.767 BaseBdev1 00:17:34.767 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:34.767 10:12:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:17:34.767 10:12:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- 
# local bdev_timeout= 00:17:34.767 10:12:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:34.767 10:12:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:34.767 10:12:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:34.767 10:12:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:34.767 10:12:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:35.027 [ 00:17:35.027 { 00:17:35.027 "name": "BaseBdev1", 00:17:35.027 "aliases": [ 00:17:35.027 "06cb0b5e-5338-4030-a2f3-b201ec104d79" 00:17:35.027 ], 00:17:35.027 "product_name": "Malloc disk", 00:17:35.027 "block_size": 512, 00:17:35.027 "num_blocks": 65536, 00:17:35.027 "uuid": "06cb0b5e-5338-4030-a2f3-b201ec104d79", 00:17:35.027 "assigned_rate_limits": { 00:17:35.027 "rw_ios_per_sec": 0, 00:17:35.027 "rw_mbytes_per_sec": 0, 00:17:35.027 "r_mbytes_per_sec": 0, 00:17:35.027 "w_mbytes_per_sec": 0 00:17:35.027 }, 00:17:35.027 "claimed": true, 00:17:35.027 "claim_type": "exclusive_write", 00:17:35.027 "zoned": false, 00:17:35.027 "supported_io_types": { 00:17:35.027 "read": true, 00:17:35.027 "write": true, 00:17:35.027 "unmap": true, 00:17:35.027 "write_zeroes": true, 00:17:35.027 "flush": true, 00:17:35.027 "reset": true, 00:17:35.027 "compare": false, 00:17:35.027 "compare_and_write": false, 00:17:35.027 "abort": true, 00:17:35.027 "nvme_admin": false, 00:17:35.027 "nvme_io": false 00:17:35.027 }, 00:17:35.027 "memory_domains": [ 00:17:35.027 { 00:17:35.027 "dma_device_id": "system", 00:17:35.027 "dma_device_type": 1 00:17:35.027 }, 00:17:35.027 { 00:17:35.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.027 "dma_device_type": 2 00:17:35.027 } 00:17:35.027 ], 00:17:35.027 "driver_specific": {} 00:17:35.027 } 00:17:35.027 ] 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.027 10:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:35.286 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.286 "name": "Existed_Raid", 00:17:35.286 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.286 "strip_size_kb": 64, 00:17:35.286 "state": "configuring", 00:17:35.286 "raid_level": "concat", 00:17:35.286 "superblock": false, 00:17:35.286 "num_base_bdevs": 4, 00:17:35.286 "num_base_bdevs_discovered": 3, 00:17:35.286 "num_base_bdevs_operational": 4, 00:17:35.286 "base_bdevs_list": [ 00:17:35.286 { 00:17:35.286 "name": "BaseBdev1", 00:17:35.286 "uuid": "06cb0b5e-5338-4030-a2f3-b201ec104d79", 00:17:35.286 "is_configured": true, 00:17:35.286 "data_offset": 0, 00:17:35.286 "data_size": 65536 00:17:35.286 }, 00:17:35.286 { 00:17:35.286 "name": null, 00:17:35.286 "uuid": "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2", 00:17:35.286 "is_configured": false, 00:17:35.286 "data_offset": 0, 00:17:35.286 "data_size": 65536 00:17:35.286 }, 00:17:35.286 { 00:17:35.286 "name": "BaseBdev3", 00:17:35.286 "uuid": "6096aea9-aee6-44ff-84f4-837beecded21", 00:17:35.286 "is_configured": true, 00:17:35.286 "data_offset": 0, 00:17:35.286 "data_size": 65536 00:17:35.286 }, 00:17:35.286 { 00:17:35.286 "name": "BaseBdev4", 00:17:35.286 "uuid": "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d", 00:17:35.286 "is_configured": true, 00:17:35.286 "data_offset": 0, 00:17:35.286 "data_size": 65536 00:17:35.286 } 00:17:35.286 ] 00:17:35.286 }' 00:17:35.286 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.286 10:12:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.854 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.854 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:36.112 [2024-06-10 10:12:57.921609] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.112 10:12:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.112 10:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.372 10:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.372 "name": "Existed_Raid", 00:17:36.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.372 "strip_size_kb": 64, 00:17:36.372 "state": "configuring", 00:17:36.372 "raid_level": "concat", 00:17:36.372 "superblock": false, 00:17:36.372 "num_base_bdevs": 4, 00:17:36.372 "num_base_bdevs_discovered": 2, 00:17:36.372 "num_base_bdevs_operational": 4, 00:17:36.372 "base_bdevs_list": [ 00:17:36.372 { 00:17:36.372 "name": "BaseBdev1", 00:17:36.372 "uuid": "06cb0b5e-5338-4030-a2f3-b201ec104d79", 00:17:36.372 "is_configured": true, 00:17:36.372 "data_offset": 0, 00:17:36.372 "data_size": 65536 00:17:36.372 }, 00:17:36.372 { 00:17:36.372 "name": null, 00:17:36.372 "uuid": "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2", 00:17:36.372 "is_configured": false, 00:17:36.372 "data_offset": 0, 00:17:36.372 "data_size": 65536 00:17:36.372 }, 00:17:36.372 { 00:17:36.372 "name": null, 00:17:36.372 "uuid": "6096aea9-aee6-44ff-84f4-837beecded21", 00:17:36.372 "is_configured": false, 00:17:36.372 "data_offset": 0, 00:17:36.372 "data_size": 65536 00:17:36.372 }, 00:17:36.372 { 00:17:36.372 "name": "BaseBdev4", 00:17:36.372 "uuid": "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d", 00:17:36.372 "is_configured": true, 00:17:36.372 "data_offset": 0, 00:17:36.372 "data_size": 65536 00:17:36.372 } 00:17:36.372 ] 00:17:36.372 }' 00:17:36.372 10:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.372 10:12:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:36.942 10:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.942 10:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:37.202 10:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:37.202 10:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:37.202 [2024-06-10 10:12:58.992339] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
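The hot-remove/re-add steps in this part of the trace all use the same probe: bdev_raid_get_bdevs all piped through jq to inspect a single slot of base_bdevs_list before and after the RPC. A small illustrative sequence, assuming the same /var/tmp/spdk-raid.sock target and the Existed_Raid volume from this run:

    # slot 2 (BaseBdev3) is unconfigured after the removal above
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq '.[0].base_bdevs_list[2].is_configured'     # expected: false
    # re-attach the base bdev, then re-check the same slot
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq '.[0].base_bdevs_list[2].is_configured'     # expected: true

The raid remains in the "configuring" state across this step, since num_base_bdevs_discovered (3) is still below num_base_bdevs_operational (4); only when all four slots are configured does the state flip to "online".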
00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.202 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.462 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.462 "name": "Existed_Raid", 00:17:37.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.462 "strip_size_kb": 64, 00:17:37.462 "state": "configuring", 00:17:37.462 "raid_level": "concat", 00:17:37.462 "superblock": false, 00:17:37.462 "num_base_bdevs": 4, 00:17:37.462 "num_base_bdevs_discovered": 3, 00:17:37.462 "num_base_bdevs_operational": 4, 00:17:37.462 "base_bdevs_list": [ 00:17:37.462 { 00:17:37.462 "name": "BaseBdev1", 00:17:37.462 "uuid": "06cb0b5e-5338-4030-a2f3-b201ec104d79", 00:17:37.462 "is_configured": true, 00:17:37.462 "data_offset": 0, 00:17:37.462 "data_size": 65536 00:17:37.462 }, 00:17:37.462 { 00:17:37.462 "name": null, 00:17:37.462 "uuid": "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2", 00:17:37.462 "is_configured": false, 00:17:37.462 "data_offset": 0, 00:17:37.462 "data_size": 65536 00:17:37.462 }, 00:17:37.462 { 00:17:37.462 "name": "BaseBdev3", 00:17:37.462 "uuid": "6096aea9-aee6-44ff-84f4-837beecded21", 00:17:37.462 "is_configured": true, 00:17:37.462 "data_offset": 0, 00:17:37.462 "data_size": 65536 00:17:37.462 }, 00:17:37.462 { 00:17:37.462 "name": "BaseBdev4", 00:17:37.462 "uuid": "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d", 00:17:37.462 "is_configured": true, 00:17:37.462 "data_offset": 0, 00:17:37.462 "data_size": 65536 00:17:37.462 } 00:17:37.462 ] 00:17:37.462 }' 00:17:37.462 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.462 10:12:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.031 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:38.031 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.290 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:38.291 10:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:38.291 [2024-06-10 10:13:00.087124] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:38.291 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:38.291 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:38.291 
10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:38.291 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:38.291 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:38.291 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:38.291 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.291 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:38.291 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.291 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.291 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.291 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.560 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.560 "name": "Existed_Raid", 00:17:38.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.560 "strip_size_kb": 64, 00:17:38.560 "state": "configuring", 00:17:38.560 "raid_level": "concat", 00:17:38.560 "superblock": false, 00:17:38.560 "num_base_bdevs": 4, 00:17:38.560 "num_base_bdevs_discovered": 2, 00:17:38.560 "num_base_bdevs_operational": 4, 00:17:38.560 "base_bdevs_list": [ 00:17:38.560 { 00:17:38.560 "name": null, 00:17:38.560 "uuid": "06cb0b5e-5338-4030-a2f3-b201ec104d79", 00:17:38.560 "is_configured": false, 00:17:38.560 "data_offset": 0, 00:17:38.560 "data_size": 65536 00:17:38.560 }, 00:17:38.560 { 00:17:38.560 "name": null, 00:17:38.560 "uuid": "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2", 00:17:38.560 "is_configured": false, 00:17:38.560 "data_offset": 0, 00:17:38.560 "data_size": 65536 00:17:38.560 }, 00:17:38.560 { 00:17:38.560 "name": "BaseBdev3", 00:17:38.560 "uuid": "6096aea9-aee6-44ff-84f4-837beecded21", 00:17:38.560 "is_configured": true, 00:17:38.560 "data_offset": 0, 00:17:38.560 "data_size": 65536 00:17:38.560 }, 00:17:38.560 { 00:17:38.560 "name": "BaseBdev4", 00:17:38.560 "uuid": "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d", 00:17:38.560 "is_configured": true, 00:17:38.560 "data_offset": 0, 00:17:38.560 "data_size": 65536 00:17:38.560 } 00:17:38.560 ] 00:17:38.560 }' 00:17:38.560 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.560 10:13:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.169 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.169 10:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:39.434 [2024-06-10 10:13:01.223476] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.434 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:39.695 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:39.695 "name": "Existed_Raid", 00:17:39.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.695 "strip_size_kb": 64, 00:17:39.695 "state": "configuring", 00:17:39.695 "raid_level": "concat", 00:17:39.695 "superblock": false, 00:17:39.695 "num_base_bdevs": 4, 00:17:39.695 "num_base_bdevs_discovered": 3, 00:17:39.695 "num_base_bdevs_operational": 4, 00:17:39.695 "base_bdevs_list": [ 00:17:39.695 { 00:17:39.695 "name": null, 00:17:39.695 "uuid": "06cb0b5e-5338-4030-a2f3-b201ec104d79", 00:17:39.695 "is_configured": false, 00:17:39.695 "data_offset": 0, 00:17:39.695 "data_size": 65536 00:17:39.695 }, 00:17:39.695 { 00:17:39.695 "name": "BaseBdev2", 00:17:39.695 "uuid": "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2", 00:17:39.695 "is_configured": true, 00:17:39.695 "data_offset": 0, 00:17:39.695 "data_size": 65536 00:17:39.695 }, 00:17:39.695 { 00:17:39.695 "name": "BaseBdev3", 00:17:39.695 "uuid": "6096aea9-aee6-44ff-84f4-837beecded21", 00:17:39.695 "is_configured": true, 00:17:39.695 "data_offset": 0, 00:17:39.695 "data_size": 65536 00:17:39.695 }, 00:17:39.695 { 00:17:39.695 "name": "BaseBdev4", 00:17:39.695 "uuid": "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d", 00:17:39.695 "is_configured": true, 00:17:39.695 "data_offset": 0, 00:17:39.695 "data_size": 65536 00:17:39.695 } 00:17:39.695 ] 00:17:39.695 }' 00:17:39.695 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:39.695 10:13:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:40.264 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.264 10:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:40.528 
10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:40.528 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:40.528 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.528 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 06cb0b5e-5338-4030-a2f3-b201ec104d79 00:17:40.795 [2024-06-10 10:13:02.515559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:40.795 [2024-06-10 10:13:02.515582] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25c0490 00:17:40.795 [2024-06-10 10:13:02.515586] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:40.795 [2024-06-10 10:13:02.515732] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25c0ca0 00:17:40.795 [2024-06-10 10:13:02.515830] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25c0490 00:17:40.795 [2024-06-10 10:13:02.515836] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25c0490 00:17:40.795 [2024-06-10 10:13:02.515954] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:40.795 NewBaseBdev 00:17:40.795 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:40.795 10:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:17:40.795 10:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:40.795 10:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:40.795 10:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:40.795 10:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:40.795 10:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:41.056 [ 00:17:41.056 { 00:17:41.056 "name": "NewBaseBdev", 00:17:41.056 "aliases": [ 00:17:41.056 "06cb0b5e-5338-4030-a2f3-b201ec104d79" 00:17:41.056 ], 00:17:41.056 "product_name": "Malloc disk", 00:17:41.056 "block_size": 512, 00:17:41.056 "num_blocks": 65536, 00:17:41.056 "uuid": "06cb0b5e-5338-4030-a2f3-b201ec104d79", 00:17:41.056 "assigned_rate_limits": { 00:17:41.056 "rw_ios_per_sec": 0, 00:17:41.056 "rw_mbytes_per_sec": 0, 00:17:41.056 "r_mbytes_per_sec": 0, 00:17:41.056 "w_mbytes_per_sec": 0 00:17:41.056 }, 00:17:41.056 "claimed": true, 00:17:41.056 "claim_type": "exclusive_write", 00:17:41.056 "zoned": false, 00:17:41.056 "supported_io_types": { 00:17:41.056 "read": true, 00:17:41.056 "write": true, 00:17:41.056 "unmap": true, 00:17:41.056 "write_zeroes": true, 00:17:41.056 "flush": true, 00:17:41.056 "reset": true, 00:17:41.056 
"compare": false, 00:17:41.056 "compare_and_write": false, 00:17:41.056 "abort": true, 00:17:41.056 "nvme_admin": false, 00:17:41.056 "nvme_io": false 00:17:41.056 }, 00:17:41.056 "memory_domains": [ 00:17:41.056 { 00:17:41.056 "dma_device_id": "system", 00:17:41.056 "dma_device_type": 1 00:17:41.056 }, 00:17:41.056 { 00:17:41.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.056 "dma_device_type": 2 00:17:41.056 } 00:17:41.056 ], 00:17:41.056 "driver_specific": {} 00:17:41.056 } 00:17:41.056 ] 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.056 10:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:41.317 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.317 "name": "Existed_Raid", 00:17:41.317 "uuid": "a5f4cb6c-6a8b-49f6-907d-fbcd620b1ede", 00:17:41.317 "strip_size_kb": 64, 00:17:41.317 "state": "online", 00:17:41.317 "raid_level": "concat", 00:17:41.317 "superblock": false, 00:17:41.317 "num_base_bdevs": 4, 00:17:41.317 "num_base_bdevs_discovered": 4, 00:17:41.317 "num_base_bdevs_operational": 4, 00:17:41.317 "base_bdevs_list": [ 00:17:41.317 { 00:17:41.317 "name": "NewBaseBdev", 00:17:41.317 "uuid": "06cb0b5e-5338-4030-a2f3-b201ec104d79", 00:17:41.317 "is_configured": true, 00:17:41.317 "data_offset": 0, 00:17:41.317 "data_size": 65536 00:17:41.317 }, 00:17:41.317 { 00:17:41.317 "name": "BaseBdev2", 00:17:41.317 "uuid": "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2", 00:17:41.317 "is_configured": true, 00:17:41.317 "data_offset": 0, 00:17:41.317 "data_size": 65536 00:17:41.317 }, 00:17:41.317 { 00:17:41.317 "name": "BaseBdev3", 00:17:41.317 "uuid": "6096aea9-aee6-44ff-84f4-837beecded21", 00:17:41.317 "is_configured": true, 00:17:41.317 "data_offset": 0, 00:17:41.317 "data_size": 65536 00:17:41.317 }, 00:17:41.317 { 00:17:41.317 "name": "BaseBdev4", 00:17:41.317 "uuid": "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d", 00:17:41.317 "is_configured": true, 00:17:41.317 "data_offset": 0, 00:17:41.317 "data_size": 65536 00:17:41.317 } 00:17:41.317 ] 00:17:41.317 }' 00:17:41.317 10:13:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.317 10:13:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.886 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:41.886 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:41.886 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:41.886 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:41.886 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:41.886 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:41.886 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:41.886 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:42.147 [2024-06-10 10:13:03.762946] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:42.147 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:42.147 "name": "Existed_Raid", 00:17:42.147 "aliases": [ 00:17:42.147 "a5f4cb6c-6a8b-49f6-907d-fbcd620b1ede" 00:17:42.147 ], 00:17:42.147 "product_name": "Raid Volume", 00:17:42.147 "block_size": 512, 00:17:42.147 "num_blocks": 262144, 00:17:42.147 "uuid": "a5f4cb6c-6a8b-49f6-907d-fbcd620b1ede", 00:17:42.147 "assigned_rate_limits": { 00:17:42.147 "rw_ios_per_sec": 0, 00:17:42.147 "rw_mbytes_per_sec": 0, 00:17:42.147 "r_mbytes_per_sec": 0, 00:17:42.147 "w_mbytes_per_sec": 0 00:17:42.147 }, 00:17:42.147 "claimed": false, 00:17:42.147 "zoned": false, 00:17:42.147 "supported_io_types": { 00:17:42.147 "read": true, 00:17:42.147 "write": true, 00:17:42.147 "unmap": true, 00:17:42.147 "write_zeroes": true, 00:17:42.147 "flush": true, 00:17:42.147 "reset": true, 00:17:42.147 "compare": false, 00:17:42.147 "compare_and_write": false, 00:17:42.147 "abort": false, 00:17:42.147 "nvme_admin": false, 00:17:42.147 "nvme_io": false 00:17:42.147 }, 00:17:42.147 "memory_domains": [ 00:17:42.147 { 00:17:42.147 "dma_device_id": "system", 00:17:42.147 "dma_device_type": 1 00:17:42.147 }, 00:17:42.147 { 00:17:42.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.147 "dma_device_type": 2 00:17:42.147 }, 00:17:42.147 { 00:17:42.147 "dma_device_id": "system", 00:17:42.147 "dma_device_type": 1 00:17:42.147 }, 00:17:42.147 { 00:17:42.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.147 "dma_device_type": 2 00:17:42.147 }, 00:17:42.147 { 00:17:42.147 "dma_device_id": "system", 00:17:42.147 "dma_device_type": 1 00:17:42.147 }, 00:17:42.147 { 00:17:42.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.147 "dma_device_type": 2 00:17:42.147 }, 00:17:42.147 { 00:17:42.147 "dma_device_id": "system", 00:17:42.147 "dma_device_type": 1 00:17:42.147 }, 00:17:42.147 { 00:17:42.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.147 "dma_device_type": 2 00:17:42.147 } 00:17:42.147 ], 00:17:42.147 "driver_specific": { 00:17:42.147 "raid": { 00:17:42.147 "uuid": "a5f4cb6c-6a8b-49f6-907d-fbcd620b1ede", 00:17:42.147 "strip_size_kb": 64, 00:17:42.147 "state": "online", 00:17:42.147 "raid_level": "concat", 00:17:42.147 "superblock": false, 00:17:42.147 
"num_base_bdevs": 4, 00:17:42.147 "num_base_bdevs_discovered": 4, 00:17:42.147 "num_base_bdevs_operational": 4, 00:17:42.147 "base_bdevs_list": [ 00:17:42.147 { 00:17:42.147 "name": "NewBaseBdev", 00:17:42.147 "uuid": "06cb0b5e-5338-4030-a2f3-b201ec104d79", 00:17:42.147 "is_configured": true, 00:17:42.147 "data_offset": 0, 00:17:42.147 "data_size": 65536 00:17:42.147 }, 00:17:42.147 { 00:17:42.147 "name": "BaseBdev2", 00:17:42.147 "uuid": "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2", 00:17:42.147 "is_configured": true, 00:17:42.147 "data_offset": 0, 00:17:42.147 "data_size": 65536 00:17:42.147 }, 00:17:42.147 { 00:17:42.147 "name": "BaseBdev3", 00:17:42.147 "uuid": "6096aea9-aee6-44ff-84f4-837beecded21", 00:17:42.147 "is_configured": true, 00:17:42.147 "data_offset": 0, 00:17:42.147 "data_size": 65536 00:17:42.147 }, 00:17:42.147 { 00:17:42.147 "name": "BaseBdev4", 00:17:42.147 "uuid": "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d", 00:17:42.147 "is_configured": true, 00:17:42.147 "data_offset": 0, 00:17:42.147 "data_size": 65536 00:17:42.147 } 00:17:42.147 ] 00:17:42.147 } 00:17:42.147 } 00:17:42.147 }' 00:17:42.147 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:42.147 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:42.147 BaseBdev2 00:17:42.147 BaseBdev3 00:17:42.147 BaseBdev4' 00:17:42.147 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:42.147 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:42.147 10:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:42.408 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:42.408 "name": "NewBaseBdev", 00:17:42.408 "aliases": [ 00:17:42.408 "06cb0b5e-5338-4030-a2f3-b201ec104d79" 00:17:42.408 ], 00:17:42.408 "product_name": "Malloc disk", 00:17:42.408 "block_size": 512, 00:17:42.408 "num_blocks": 65536, 00:17:42.408 "uuid": "06cb0b5e-5338-4030-a2f3-b201ec104d79", 00:17:42.408 "assigned_rate_limits": { 00:17:42.408 "rw_ios_per_sec": 0, 00:17:42.408 "rw_mbytes_per_sec": 0, 00:17:42.408 "r_mbytes_per_sec": 0, 00:17:42.408 "w_mbytes_per_sec": 0 00:17:42.408 }, 00:17:42.408 "claimed": true, 00:17:42.408 "claim_type": "exclusive_write", 00:17:42.408 "zoned": false, 00:17:42.408 "supported_io_types": { 00:17:42.408 "read": true, 00:17:42.408 "write": true, 00:17:42.408 "unmap": true, 00:17:42.408 "write_zeroes": true, 00:17:42.408 "flush": true, 00:17:42.408 "reset": true, 00:17:42.408 "compare": false, 00:17:42.408 "compare_and_write": false, 00:17:42.408 "abort": true, 00:17:42.408 "nvme_admin": false, 00:17:42.408 "nvme_io": false 00:17:42.408 }, 00:17:42.408 "memory_domains": [ 00:17:42.408 { 00:17:42.408 "dma_device_id": "system", 00:17:42.408 "dma_device_type": 1 00:17:42.408 }, 00:17:42.408 { 00:17:42.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.408 "dma_device_type": 2 00:17:42.408 } 00:17:42.408 ], 00:17:42.408 "driver_specific": {} 00:17:42.408 }' 00:17:42.408 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.408 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.408 10:13:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:42.408 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.408 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.408 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:42.408 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.408 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.669 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:42.669 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.669 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.669 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:42.669 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:42.669 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:42.669 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:42.929 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:42.929 "name": "BaseBdev2", 00:17:42.929 "aliases": [ 00:17:42.929 "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2" 00:17:42.929 ], 00:17:42.929 "product_name": "Malloc disk", 00:17:42.929 "block_size": 512, 00:17:42.929 "num_blocks": 65536, 00:17:42.929 "uuid": "4c567d8e-ddcc-434b-a52a-9e1dd9bafbb2", 00:17:42.929 "assigned_rate_limits": { 00:17:42.929 "rw_ios_per_sec": 0, 00:17:42.929 "rw_mbytes_per_sec": 0, 00:17:42.929 "r_mbytes_per_sec": 0, 00:17:42.929 "w_mbytes_per_sec": 0 00:17:42.929 }, 00:17:42.929 "claimed": true, 00:17:42.929 "claim_type": "exclusive_write", 00:17:42.929 "zoned": false, 00:17:42.929 "supported_io_types": { 00:17:42.929 "read": true, 00:17:42.929 "write": true, 00:17:42.929 "unmap": true, 00:17:42.929 "write_zeroes": true, 00:17:42.929 "flush": true, 00:17:42.929 "reset": true, 00:17:42.929 "compare": false, 00:17:42.929 "compare_and_write": false, 00:17:42.929 "abort": true, 00:17:42.929 "nvme_admin": false, 00:17:42.929 "nvme_io": false 00:17:42.929 }, 00:17:42.929 "memory_domains": [ 00:17:42.929 { 00:17:42.929 "dma_device_id": "system", 00:17:42.929 "dma_device_type": 1 00:17:42.929 }, 00:17:42.929 { 00:17:42.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.929 "dma_device_type": 2 00:17:42.929 } 00:17:42.929 ], 00:17:42.929 "driver_specific": {} 00:17:42.929 }' 00:17:42.929 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.929 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.929 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:42.929 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.929 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.929 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:42.929 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:17:43.190 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.190 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:43.190 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.190 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.190 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:43.190 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:43.190 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:43.190 10:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:43.451 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:43.451 "name": "BaseBdev3", 00:17:43.451 "aliases": [ 00:17:43.451 "6096aea9-aee6-44ff-84f4-837beecded21" 00:17:43.451 ], 00:17:43.451 "product_name": "Malloc disk", 00:17:43.451 "block_size": 512, 00:17:43.451 "num_blocks": 65536, 00:17:43.451 "uuid": "6096aea9-aee6-44ff-84f4-837beecded21", 00:17:43.451 "assigned_rate_limits": { 00:17:43.451 "rw_ios_per_sec": 0, 00:17:43.451 "rw_mbytes_per_sec": 0, 00:17:43.451 "r_mbytes_per_sec": 0, 00:17:43.451 "w_mbytes_per_sec": 0 00:17:43.451 }, 00:17:43.451 "claimed": true, 00:17:43.451 "claim_type": "exclusive_write", 00:17:43.451 "zoned": false, 00:17:43.451 "supported_io_types": { 00:17:43.451 "read": true, 00:17:43.451 "write": true, 00:17:43.451 "unmap": true, 00:17:43.451 "write_zeroes": true, 00:17:43.451 "flush": true, 00:17:43.451 "reset": true, 00:17:43.451 "compare": false, 00:17:43.451 "compare_and_write": false, 00:17:43.451 "abort": true, 00:17:43.451 "nvme_admin": false, 00:17:43.451 "nvme_io": false 00:17:43.451 }, 00:17:43.451 "memory_domains": [ 00:17:43.451 { 00:17:43.451 "dma_device_id": "system", 00:17:43.451 "dma_device_type": 1 00:17:43.451 }, 00:17:43.451 { 00:17:43.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.451 "dma_device_type": 2 00:17:43.451 } 00:17:43.451 ], 00:17:43.451 "driver_specific": {} 00:17:43.451 }' 00:17:43.451 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.451 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.451 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:43.451 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.451 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.451 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:43.451 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.712 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.712 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:43.712 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.712 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.712 10:13:05 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:43.712 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:43.712 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:43.712 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:43.972 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:43.972 "name": "BaseBdev4", 00:17:43.972 "aliases": [ 00:17:43.972 "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d" 00:17:43.972 ], 00:17:43.972 "product_name": "Malloc disk", 00:17:43.972 "block_size": 512, 00:17:43.972 "num_blocks": 65536, 00:17:43.972 "uuid": "8a3f4c77-ee7c-475d-9f9c-d526dfe3191d", 00:17:43.972 "assigned_rate_limits": { 00:17:43.972 "rw_ios_per_sec": 0, 00:17:43.972 "rw_mbytes_per_sec": 0, 00:17:43.972 "r_mbytes_per_sec": 0, 00:17:43.972 "w_mbytes_per_sec": 0 00:17:43.972 }, 00:17:43.972 "claimed": true, 00:17:43.972 "claim_type": "exclusive_write", 00:17:43.972 "zoned": false, 00:17:43.972 "supported_io_types": { 00:17:43.972 "read": true, 00:17:43.972 "write": true, 00:17:43.972 "unmap": true, 00:17:43.972 "write_zeroes": true, 00:17:43.972 "flush": true, 00:17:43.972 "reset": true, 00:17:43.972 "compare": false, 00:17:43.972 "compare_and_write": false, 00:17:43.972 "abort": true, 00:17:43.972 "nvme_admin": false, 00:17:43.972 "nvme_io": false 00:17:43.972 }, 00:17:43.972 "memory_domains": [ 00:17:43.972 { 00:17:43.972 "dma_device_id": "system", 00:17:43.972 "dma_device_type": 1 00:17:43.972 }, 00:17:43.972 { 00:17:43.972 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.972 "dma_device_type": 2 00:17:43.972 } 00:17:43.972 ], 00:17:43.972 "driver_specific": {} 00:17:43.972 }' 00:17:43.972 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.972 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.972 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:43.972 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.972 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.972 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:43.972 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.973 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.234 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:44.234 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.234 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.234 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:44.234 10:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:44.496 [2024-06-10 10:13:06.132731] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:44.496 [2024-06-10 10:13:06.132751] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state 
changing from online to offline 00:17:44.496 [2024-06-10 10:13:06.132789] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:44.496 [2024-06-10 10:13:06.132842] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:44.496 [2024-06-10 10:13:06.132849] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c0490 name Existed_Raid, state offline 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1037087 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 1037087 ']' 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1037087 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1037087 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1037087' 00:17:44.496 killing process with pid 1037087 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1037087 00:17:44.496 [2024-06-10 10:13:06.200979] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1037087 00:17:44.496 [2024-06-10 10:13:06.221550] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:44.496 00:17:44.496 real 0m27.012s 00:17:44.496 user 0m50.660s 00:17:44.496 sys 0m3.942s 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:44.496 10:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.496 ************************************ 00:17:44.496 END TEST raid_state_function_test 00:17:44.496 ************************************ 00:17:44.757 10:13:06 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:17:44.757 10:13:06 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:17:44.757 10:13:06 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:44.757 10:13:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:44.757 ************************************ 00:17:44.757 START TEST raid_state_function_test_sb 00:17:44.757 ************************************ 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 4 true 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:44.757 
10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1042350 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1042350' 00:17:44.757 Process raid pid: 1042350 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1042350 /var/tmp/spdk-raid.sock 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:44.757 10:13:06 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1042350 ']' 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:44.757 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:44.757 10:13:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:44.757 [2024-06-10 10:13:06.483266] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:17:44.757 [2024-06-10 10:13:06.483311] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:44.757 [2024-06-10 10:13:06.549207] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:44.757 [2024-06-10 10:13:06.611352] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:45.018 [2024-06-10 10:13:06.649755] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:45.018 [2024-06-10 10:13:06.649777] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:45.590 10:13:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:45.590 10:13:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:17:45.590 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:45.590 [2024-06-10 10:13:07.452594] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:45.590 [2024-06-10 10:13:07.452622] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:45.590 [2024-06-10 10:13:07.452628] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:45.590 [2024-06-10 10:13:07.452634] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:45.590 [2024-06-10 10:13:07.452644] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:45.590 [2024-06-10 10:13:07.452650] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:45.590 [2024-06-10 10:13:07.452654] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:45.590 [2024-06-10 10:13:07.452659] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:45.850 10:13:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.850 "name": "Existed_Raid", 00:17:45.850 "uuid": "8c74509b-6a2e-4569-a2cf-da12ef0508e9", 00:17:45.850 "strip_size_kb": 64, 00:17:45.850 "state": "configuring", 00:17:45.850 "raid_level": "concat", 00:17:45.850 "superblock": true, 00:17:45.850 "num_base_bdevs": 4, 00:17:45.850 "num_base_bdevs_discovered": 0, 00:17:45.850 "num_base_bdevs_operational": 4, 00:17:45.850 "base_bdevs_list": [ 00:17:45.850 { 00:17:45.850 "name": "BaseBdev1", 00:17:45.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.850 "is_configured": false, 00:17:45.850 "data_offset": 0, 00:17:45.850 "data_size": 0 00:17:45.850 }, 00:17:45.850 { 00:17:45.850 "name": "BaseBdev2", 00:17:45.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.850 "is_configured": false, 00:17:45.850 "data_offset": 0, 00:17:45.850 "data_size": 0 00:17:45.850 }, 00:17:45.850 { 00:17:45.850 "name": "BaseBdev3", 00:17:45.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.850 "is_configured": false, 00:17:45.850 "data_offset": 0, 00:17:45.850 "data_size": 0 00:17:45.850 }, 00:17:45.850 { 00:17:45.850 "name": "BaseBdev4", 00:17:45.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.850 "is_configured": false, 00:17:45.850 "data_offset": 0, 00:17:45.850 "data_size": 0 00:17:45.850 } 00:17:45.850 ] 00:17:45.850 }' 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.850 10:13:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:46.420 10:13:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:46.681 [2024-06-10 10:13:08.366791] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:46.681 [2024-06-10 10:13:08.366807] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2794b20 name Existed_Raid, state configuring 00:17:46.681 10:13:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s 
-r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:46.942 [2024-06-10 10:13:08.559293] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:46.942 [2024-06-10 10:13:08.559308] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:46.942 [2024-06-10 10:13:08.559319] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:46.942 [2024-06-10 10:13:08.559324] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:46.942 [2024-06-10 10:13:08.559329] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:46.942 [2024-06-10 10:13:08.559334] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:46.942 [2024-06-10 10:13:08.559339] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:46.942 [2024-06-10 10:13:08.559344] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:46.942 10:13:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:46.942 [2024-06-10 10:13:08.746212] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:46.942 BaseBdev1 00:17:46.942 10:13:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:46.942 10:13:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:17:46.942 10:13:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:46.942 10:13:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:46.942 10:13:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:46.942 10:13:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:46.942 10:13:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:47.204 10:13:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:47.465 [ 00:17:47.465 { 00:17:47.465 "name": "BaseBdev1", 00:17:47.465 "aliases": [ 00:17:47.465 "42bf26c4-1d6e-4bf6-9f04-c7d049d0ecca" 00:17:47.465 ], 00:17:47.465 "product_name": "Malloc disk", 00:17:47.465 "block_size": 512, 00:17:47.465 "num_blocks": 65536, 00:17:47.465 "uuid": "42bf26c4-1d6e-4bf6-9f04-c7d049d0ecca", 00:17:47.465 "assigned_rate_limits": { 00:17:47.465 "rw_ios_per_sec": 0, 00:17:47.465 "rw_mbytes_per_sec": 0, 00:17:47.465 "r_mbytes_per_sec": 0, 00:17:47.465 "w_mbytes_per_sec": 0 00:17:47.465 }, 00:17:47.465 "claimed": true, 00:17:47.465 "claim_type": "exclusive_write", 00:17:47.465 "zoned": false, 00:17:47.465 "supported_io_types": { 00:17:47.465 "read": true, 00:17:47.465 "write": true, 00:17:47.465 "unmap": true, 00:17:47.465 "write_zeroes": true, 00:17:47.465 "flush": true, 00:17:47.465 "reset": true, 00:17:47.465 "compare": false, 00:17:47.465 "compare_and_write": false, 00:17:47.465 "abort": true, 
00:17:47.465 "nvme_admin": false, 00:17:47.465 "nvme_io": false 00:17:47.465 }, 00:17:47.465 "memory_domains": [ 00:17:47.465 { 00:17:47.465 "dma_device_id": "system", 00:17:47.465 "dma_device_type": 1 00:17:47.465 }, 00:17:47.465 { 00:17:47.465 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.465 "dma_device_type": 2 00:17:47.465 } 00:17:47.465 ], 00:17:47.465 "driver_specific": {} 00:17:47.465 } 00:17:47.465 ] 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:47.465 "name": "Existed_Raid", 00:17:47.465 "uuid": "9ec09aad-552e-4383-ad9b-d8e116eb3ed1", 00:17:47.465 "strip_size_kb": 64, 00:17:47.465 "state": "configuring", 00:17:47.465 "raid_level": "concat", 00:17:47.465 "superblock": true, 00:17:47.465 "num_base_bdevs": 4, 00:17:47.465 "num_base_bdevs_discovered": 1, 00:17:47.465 "num_base_bdevs_operational": 4, 00:17:47.465 "base_bdevs_list": [ 00:17:47.465 { 00:17:47.465 "name": "BaseBdev1", 00:17:47.465 "uuid": "42bf26c4-1d6e-4bf6-9f04-c7d049d0ecca", 00:17:47.465 "is_configured": true, 00:17:47.465 "data_offset": 2048, 00:17:47.465 "data_size": 63488 00:17:47.465 }, 00:17:47.465 { 00:17:47.465 "name": "BaseBdev2", 00:17:47.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:47.465 "is_configured": false, 00:17:47.465 "data_offset": 0, 00:17:47.465 "data_size": 0 00:17:47.465 }, 00:17:47.465 { 00:17:47.465 "name": "BaseBdev3", 00:17:47.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:47.465 "is_configured": false, 00:17:47.465 "data_offset": 0, 00:17:47.465 "data_size": 0 00:17:47.465 }, 00:17:47.465 { 00:17:47.465 "name": "BaseBdev4", 00:17:47.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:47.465 "is_configured": false, 00:17:47.465 "data_offset": 0, 00:17:47.465 "data_size": 0 00:17:47.465 } 00:17:47.465 ] 00:17:47.465 }' 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:47.465 10:13:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:48.037 10:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:48.297 [2024-06-10 10:13:10.013419] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:48.297 [2024-06-10 10:13:10.013444] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27943b0 name Existed_Raid, state configuring 00:17:48.297 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:48.558 [2024-06-10 10:13:10.209955] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:48.558 [2024-06-10 10:13:10.211103] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:48.558 [2024-06-10 10:13:10.211126] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:48.558 [2024-06-10 10:13:10.211132] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:48.558 [2024-06-10 10:13:10.211137] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:48.558 [2024-06-10 10:13:10.211142] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:48.558 [2024-06-10 10:13:10.211147] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:48.558 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:48.558 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:48.558 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:48.558 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:48.558 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:48.558 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:48.558 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:48.558 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:48.558 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:48.558 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:48.559 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:48.559 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:48.559 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.559 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:17:48.819 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:48.819 "name": "Existed_Raid", 00:17:48.819 "uuid": "a933d211-8b73-43e9-a8a7-28b4d581e807", 00:17:48.819 "strip_size_kb": 64, 00:17:48.819 "state": "configuring", 00:17:48.819 "raid_level": "concat", 00:17:48.819 "superblock": true, 00:17:48.819 "num_base_bdevs": 4, 00:17:48.819 "num_base_bdevs_discovered": 1, 00:17:48.819 "num_base_bdevs_operational": 4, 00:17:48.819 "base_bdevs_list": [ 00:17:48.819 { 00:17:48.819 "name": "BaseBdev1", 00:17:48.819 "uuid": "42bf26c4-1d6e-4bf6-9f04-c7d049d0ecca", 00:17:48.819 "is_configured": true, 00:17:48.819 "data_offset": 2048, 00:17:48.819 "data_size": 63488 00:17:48.819 }, 00:17:48.819 { 00:17:48.819 "name": "BaseBdev2", 00:17:48.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.819 "is_configured": false, 00:17:48.819 "data_offset": 0, 00:17:48.819 "data_size": 0 00:17:48.819 }, 00:17:48.819 { 00:17:48.819 "name": "BaseBdev3", 00:17:48.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.819 "is_configured": false, 00:17:48.819 "data_offset": 0, 00:17:48.819 "data_size": 0 00:17:48.819 }, 00:17:48.819 { 00:17:48.819 "name": "BaseBdev4", 00:17:48.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.819 "is_configured": false, 00:17:48.819 "data_offset": 0, 00:17:48.819 "data_size": 0 00:17:48.819 } 00:17:48.819 ] 00:17:48.819 }' 00:17:48.819 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:48.819 10:13:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:49.081 10:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:49.342 [2024-06-10 10:13:11.028960] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:49.342 BaseBdev2 00:17:49.342 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:49.342 10:13:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:17:49.342 10:13:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:49.342 10:13:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:49.342 10:13:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:49.342 10:13:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:49.342 10:13:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:49.603 [ 00:17:49.603 { 00:17:49.603 "name": "BaseBdev2", 00:17:49.603 "aliases": [ 00:17:49.603 "8dc06475-e562-4a78-a3db-5aef682fcfec" 00:17:49.603 ], 00:17:49.603 "product_name": "Malloc disk", 00:17:49.603 "block_size": 512, 00:17:49.603 "num_blocks": 65536, 00:17:49.603 "uuid": "8dc06475-e562-4a78-a3db-5aef682fcfec", 00:17:49.603 "assigned_rate_limits": { 00:17:49.603 "rw_ios_per_sec": 0, 
00:17:49.603 "rw_mbytes_per_sec": 0, 00:17:49.603 "r_mbytes_per_sec": 0, 00:17:49.603 "w_mbytes_per_sec": 0 00:17:49.603 }, 00:17:49.603 "claimed": true, 00:17:49.603 "claim_type": "exclusive_write", 00:17:49.603 "zoned": false, 00:17:49.603 "supported_io_types": { 00:17:49.603 "read": true, 00:17:49.603 "write": true, 00:17:49.603 "unmap": true, 00:17:49.603 "write_zeroes": true, 00:17:49.603 "flush": true, 00:17:49.603 "reset": true, 00:17:49.603 "compare": false, 00:17:49.603 "compare_and_write": false, 00:17:49.603 "abort": true, 00:17:49.603 "nvme_admin": false, 00:17:49.603 "nvme_io": false 00:17:49.603 }, 00:17:49.603 "memory_domains": [ 00:17:49.603 { 00:17:49.603 "dma_device_id": "system", 00:17:49.603 "dma_device_type": 1 00:17:49.603 }, 00:17:49.603 { 00:17:49.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.603 "dma_device_type": 2 00:17:49.603 } 00:17:49.603 ], 00:17:49.603 "driver_specific": {} 00:17:49.603 } 00:17:49.603 ] 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.603 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:49.865 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:49.865 "name": "Existed_Raid", 00:17:49.865 "uuid": "a933d211-8b73-43e9-a8a7-28b4d581e807", 00:17:49.865 "strip_size_kb": 64, 00:17:49.865 "state": "configuring", 00:17:49.865 "raid_level": "concat", 00:17:49.865 "superblock": true, 00:17:49.865 "num_base_bdevs": 4, 00:17:49.865 "num_base_bdevs_discovered": 2, 00:17:49.865 "num_base_bdevs_operational": 4, 00:17:49.865 "base_bdevs_list": [ 00:17:49.865 { 00:17:49.865 "name": "BaseBdev1", 00:17:49.865 "uuid": "42bf26c4-1d6e-4bf6-9f04-c7d049d0ecca", 00:17:49.865 "is_configured": true, 00:17:49.865 "data_offset": 2048, 00:17:49.865 "data_size": 63488 00:17:49.865 }, 
00:17:49.865 { 00:17:49.865 "name": "BaseBdev2", 00:17:49.865 "uuid": "8dc06475-e562-4a78-a3db-5aef682fcfec", 00:17:49.865 "is_configured": true, 00:17:49.865 "data_offset": 2048, 00:17:49.865 "data_size": 63488 00:17:49.865 }, 00:17:49.865 { 00:17:49.865 "name": "BaseBdev3", 00:17:49.865 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.865 "is_configured": false, 00:17:49.865 "data_offset": 0, 00:17:49.865 "data_size": 0 00:17:49.865 }, 00:17:49.865 { 00:17:49.865 "name": "BaseBdev4", 00:17:49.865 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.865 "is_configured": false, 00:17:49.865 "data_offset": 0, 00:17:49.865 "data_size": 0 00:17:49.865 } 00:17:49.865 ] 00:17:49.865 }' 00:17:49.865 10:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:49.865 10:13:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:50.438 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:50.699 [2024-06-10 10:13:12.321106] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:50.699 BaseBdev3 00:17:50.699 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:50.699 10:13:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:17:50.699 10:13:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:50.699 10:13:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:50.699 10:13:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:50.699 10:13:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:50.700 10:13:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:50.700 10:13:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:50.959 [ 00:17:50.959 { 00:17:50.959 "name": "BaseBdev3", 00:17:50.959 "aliases": [ 00:17:50.959 "e991f74d-21d6-4c0a-8a98-88f8ba863560" 00:17:50.959 ], 00:17:50.959 "product_name": "Malloc disk", 00:17:50.959 "block_size": 512, 00:17:50.959 "num_blocks": 65536, 00:17:50.959 "uuid": "e991f74d-21d6-4c0a-8a98-88f8ba863560", 00:17:50.959 "assigned_rate_limits": { 00:17:50.959 "rw_ios_per_sec": 0, 00:17:50.959 "rw_mbytes_per_sec": 0, 00:17:50.959 "r_mbytes_per_sec": 0, 00:17:50.959 "w_mbytes_per_sec": 0 00:17:50.959 }, 00:17:50.959 "claimed": true, 00:17:50.959 "claim_type": "exclusive_write", 00:17:50.959 "zoned": false, 00:17:50.959 "supported_io_types": { 00:17:50.959 "read": true, 00:17:50.959 "write": true, 00:17:50.959 "unmap": true, 00:17:50.959 "write_zeroes": true, 00:17:50.959 "flush": true, 00:17:50.959 "reset": true, 00:17:50.959 "compare": false, 00:17:50.959 "compare_and_write": false, 00:17:50.959 "abort": true, 00:17:50.959 "nvme_admin": false, 00:17:50.959 "nvme_io": false 00:17:50.959 }, 00:17:50.959 "memory_domains": [ 00:17:50.959 { 00:17:50.959 "dma_device_id": "system", 00:17:50.959 "dma_device_type": 1 00:17:50.959 
}, 00:17:50.959 { 00:17:50.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.959 "dma_device_type": 2 00:17:50.959 } 00:17:50.959 ], 00:17:50.959 "driver_specific": {} 00:17:50.959 } 00:17:50.959 ] 00:17:50.959 10:13:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.960 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:51.221 10:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:51.221 "name": "Existed_Raid", 00:17:51.221 "uuid": "a933d211-8b73-43e9-a8a7-28b4d581e807", 00:17:51.221 "strip_size_kb": 64, 00:17:51.221 "state": "configuring", 00:17:51.221 "raid_level": "concat", 00:17:51.221 "superblock": true, 00:17:51.221 "num_base_bdevs": 4, 00:17:51.221 "num_base_bdevs_discovered": 3, 00:17:51.221 "num_base_bdevs_operational": 4, 00:17:51.221 "base_bdevs_list": [ 00:17:51.221 { 00:17:51.221 "name": "BaseBdev1", 00:17:51.221 "uuid": "42bf26c4-1d6e-4bf6-9f04-c7d049d0ecca", 00:17:51.221 "is_configured": true, 00:17:51.221 "data_offset": 2048, 00:17:51.221 "data_size": 63488 00:17:51.221 }, 00:17:51.221 { 00:17:51.221 "name": "BaseBdev2", 00:17:51.221 "uuid": "8dc06475-e562-4a78-a3db-5aef682fcfec", 00:17:51.221 "is_configured": true, 00:17:51.221 "data_offset": 2048, 00:17:51.221 "data_size": 63488 00:17:51.221 }, 00:17:51.221 { 00:17:51.221 "name": "BaseBdev3", 00:17:51.221 "uuid": "e991f74d-21d6-4c0a-8a98-88f8ba863560", 00:17:51.221 "is_configured": true, 00:17:51.221 "data_offset": 2048, 00:17:51.221 "data_size": 63488 00:17:51.221 }, 00:17:51.221 { 00:17:51.221 "name": "BaseBdev4", 00:17:51.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.221 "is_configured": false, 00:17:51.221 "data_offset": 0, 00:17:51.221 "data_size": 0 00:17:51.221 } 00:17:51.221 ] 00:17:51.221 }' 00:17:51.221 10:13:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:51.221 10:13:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:51.793 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:51.793 [2024-06-10 10:13:13.569215] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:51.793 [2024-06-10 10:13:13.569340] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27954c0 00:17:51.793 [2024-06-10 10:13:13.569348] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:51.793 [2024-06-10 10:13:13.569489] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2946820 00:17:51.793 [2024-06-10 10:13:13.569578] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27954c0 00:17:51.793 [2024-06-10 10:13:13.569583] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x27954c0 00:17:51.793 [2024-06-10 10:13:13.569650] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:51.793 BaseBdev4 00:17:51.793 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:51.793 10:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:17:51.793 10:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:51.793 10:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:51.793 10:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:51.793 10:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:51.793 10:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.054 10:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:52.315 [ 00:17:52.315 { 00:17:52.315 "name": "BaseBdev4", 00:17:52.315 "aliases": [ 00:17:52.315 "17c5a5d1-b017-4f7d-a645-c5ca43dbc4cb" 00:17:52.315 ], 00:17:52.315 "product_name": "Malloc disk", 00:17:52.315 "block_size": 512, 00:17:52.315 "num_blocks": 65536, 00:17:52.315 "uuid": "17c5a5d1-b017-4f7d-a645-c5ca43dbc4cb", 00:17:52.315 "assigned_rate_limits": { 00:17:52.315 "rw_ios_per_sec": 0, 00:17:52.315 "rw_mbytes_per_sec": 0, 00:17:52.315 "r_mbytes_per_sec": 0, 00:17:52.315 "w_mbytes_per_sec": 0 00:17:52.315 }, 00:17:52.315 "claimed": true, 00:17:52.315 "claim_type": "exclusive_write", 00:17:52.315 "zoned": false, 00:17:52.315 "supported_io_types": { 00:17:52.315 "read": true, 00:17:52.315 "write": true, 00:17:52.315 "unmap": true, 00:17:52.315 "write_zeroes": true, 00:17:52.315 "flush": true, 00:17:52.315 "reset": true, 00:17:52.315 "compare": false, 00:17:52.315 "compare_and_write": false, 00:17:52.315 "abort": true, 00:17:52.315 "nvme_admin": false, 00:17:52.315 "nvme_io": false 00:17:52.315 }, 00:17:52.315 "memory_domains": [ 00:17:52.315 { 00:17:52.315 "dma_device_id": "system", 
00:17:52.315 "dma_device_type": 1 00:17:52.315 }, 00:17:52.315 { 00:17:52.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.315 "dma_device_type": 2 00:17:52.315 } 00:17:52.315 ], 00:17:52.315 "driver_specific": {} 00:17:52.315 } 00:17:52.315 ] 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.315 10:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:52.315 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.315 "name": "Existed_Raid", 00:17:52.315 "uuid": "a933d211-8b73-43e9-a8a7-28b4d581e807", 00:17:52.315 "strip_size_kb": 64, 00:17:52.315 "state": "online", 00:17:52.315 "raid_level": "concat", 00:17:52.315 "superblock": true, 00:17:52.315 "num_base_bdevs": 4, 00:17:52.315 "num_base_bdevs_discovered": 4, 00:17:52.315 "num_base_bdevs_operational": 4, 00:17:52.315 "base_bdevs_list": [ 00:17:52.315 { 00:17:52.315 "name": "BaseBdev1", 00:17:52.315 "uuid": "42bf26c4-1d6e-4bf6-9f04-c7d049d0ecca", 00:17:52.315 "is_configured": true, 00:17:52.315 "data_offset": 2048, 00:17:52.316 "data_size": 63488 00:17:52.316 }, 00:17:52.316 { 00:17:52.316 "name": "BaseBdev2", 00:17:52.316 "uuid": "8dc06475-e562-4a78-a3db-5aef682fcfec", 00:17:52.316 "is_configured": true, 00:17:52.316 "data_offset": 2048, 00:17:52.316 "data_size": 63488 00:17:52.316 }, 00:17:52.316 { 00:17:52.316 "name": "BaseBdev3", 00:17:52.316 "uuid": "e991f74d-21d6-4c0a-8a98-88f8ba863560", 00:17:52.316 "is_configured": true, 00:17:52.316 "data_offset": 2048, 00:17:52.316 "data_size": 63488 00:17:52.316 }, 00:17:52.316 { 00:17:52.316 "name": "BaseBdev4", 00:17:52.316 "uuid": "17c5a5d1-b017-4f7d-a645-c5ca43dbc4cb", 00:17:52.316 "is_configured": true, 00:17:52.316 "data_offset": 2048, 00:17:52.316 "data_size": 63488 00:17:52.316 } 00:17:52.316 ] 00:17:52.316 }' 00:17:52.316 
10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.316 10:13:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:52.887 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:52.887 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:52.887 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:52.887 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:52.887 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:52.887 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:52.887 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:52.888 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:53.148 [2024-06-10 10:13:14.800564] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:53.148 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:53.148 "name": "Existed_Raid", 00:17:53.148 "aliases": [ 00:17:53.148 "a933d211-8b73-43e9-a8a7-28b4d581e807" 00:17:53.148 ], 00:17:53.148 "product_name": "Raid Volume", 00:17:53.148 "block_size": 512, 00:17:53.148 "num_blocks": 253952, 00:17:53.148 "uuid": "a933d211-8b73-43e9-a8a7-28b4d581e807", 00:17:53.148 "assigned_rate_limits": { 00:17:53.148 "rw_ios_per_sec": 0, 00:17:53.148 "rw_mbytes_per_sec": 0, 00:17:53.148 "r_mbytes_per_sec": 0, 00:17:53.148 "w_mbytes_per_sec": 0 00:17:53.148 }, 00:17:53.148 "claimed": false, 00:17:53.148 "zoned": false, 00:17:53.148 "supported_io_types": { 00:17:53.148 "read": true, 00:17:53.148 "write": true, 00:17:53.148 "unmap": true, 00:17:53.148 "write_zeroes": true, 00:17:53.148 "flush": true, 00:17:53.148 "reset": true, 00:17:53.148 "compare": false, 00:17:53.148 "compare_and_write": false, 00:17:53.148 "abort": false, 00:17:53.148 "nvme_admin": false, 00:17:53.148 "nvme_io": false 00:17:53.148 }, 00:17:53.148 "memory_domains": [ 00:17:53.148 { 00:17:53.148 "dma_device_id": "system", 00:17:53.148 "dma_device_type": 1 00:17:53.148 }, 00:17:53.148 { 00:17:53.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.148 "dma_device_type": 2 00:17:53.148 }, 00:17:53.148 { 00:17:53.148 "dma_device_id": "system", 00:17:53.148 "dma_device_type": 1 00:17:53.148 }, 00:17:53.148 { 00:17:53.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.148 "dma_device_type": 2 00:17:53.148 }, 00:17:53.148 { 00:17:53.148 "dma_device_id": "system", 00:17:53.148 "dma_device_type": 1 00:17:53.148 }, 00:17:53.148 { 00:17:53.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.148 "dma_device_type": 2 00:17:53.148 }, 00:17:53.148 { 00:17:53.148 "dma_device_id": "system", 00:17:53.148 "dma_device_type": 1 00:17:53.148 }, 00:17:53.148 { 00:17:53.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.148 "dma_device_type": 2 00:17:53.148 } 00:17:53.148 ], 00:17:53.148 "driver_specific": { 00:17:53.148 "raid": { 00:17:53.148 "uuid": "a933d211-8b73-43e9-a8a7-28b4d581e807", 00:17:53.148 "strip_size_kb": 64, 00:17:53.148 "state": "online", 00:17:53.148 "raid_level": "concat", 
00:17:53.148 "superblock": true, 00:17:53.148 "num_base_bdevs": 4, 00:17:53.148 "num_base_bdevs_discovered": 4, 00:17:53.148 "num_base_bdevs_operational": 4, 00:17:53.148 "base_bdevs_list": [ 00:17:53.148 { 00:17:53.148 "name": "BaseBdev1", 00:17:53.148 "uuid": "42bf26c4-1d6e-4bf6-9f04-c7d049d0ecca", 00:17:53.148 "is_configured": true, 00:17:53.148 "data_offset": 2048, 00:17:53.148 "data_size": 63488 00:17:53.148 }, 00:17:53.148 { 00:17:53.148 "name": "BaseBdev2", 00:17:53.148 "uuid": "8dc06475-e562-4a78-a3db-5aef682fcfec", 00:17:53.148 "is_configured": true, 00:17:53.148 "data_offset": 2048, 00:17:53.148 "data_size": 63488 00:17:53.148 }, 00:17:53.148 { 00:17:53.148 "name": "BaseBdev3", 00:17:53.148 "uuid": "e991f74d-21d6-4c0a-8a98-88f8ba863560", 00:17:53.148 "is_configured": true, 00:17:53.148 "data_offset": 2048, 00:17:53.148 "data_size": 63488 00:17:53.148 }, 00:17:53.148 { 00:17:53.148 "name": "BaseBdev4", 00:17:53.148 "uuid": "17c5a5d1-b017-4f7d-a645-c5ca43dbc4cb", 00:17:53.148 "is_configured": true, 00:17:53.148 "data_offset": 2048, 00:17:53.148 "data_size": 63488 00:17:53.148 } 00:17:53.148 ] 00:17:53.148 } 00:17:53.148 } 00:17:53.148 }' 00:17:53.148 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:53.148 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:53.148 BaseBdev2 00:17:53.148 BaseBdev3 00:17:53.148 BaseBdev4' 00:17:53.148 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:53.148 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:53.148 10:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:53.409 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:53.409 "name": "BaseBdev1", 00:17:53.409 "aliases": [ 00:17:53.409 "42bf26c4-1d6e-4bf6-9f04-c7d049d0ecca" 00:17:53.409 ], 00:17:53.409 "product_name": "Malloc disk", 00:17:53.409 "block_size": 512, 00:17:53.409 "num_blocks": 65536, 00:17:53.409 "uuid": "42bf26c4-1d6e-4bf6-9f04-c7d049d0ecca", 00:17:53.409 "assigned_rate_limits": { 00:17:53.409 "rw_ios_per_sec": 0, 00:17:53.409 "rw_mbytes_per_sec": 0, 00:17:53.409 "r_mbytes_per_sec": 0, 00:17:53.409 "w_mbytes_per_sec": 0 00:17:53.409 }, 00:17:53.409 "claimed": true, 00:17:53.409 "claim_type": "exclusive_write", 00:17:53.409 "zoned": false, 00:17:53.409 "supported_io_types": { 00:17:53.409 "read": true, 00:17:53.409 "write": true, 00:17:53.409 "unmap": true, 00:17:53.409 "write_zeroes": true, 00:17:53.409 "flush": true, 00:17:53.409 "reset": true, 00:17:53.409 "compare": false, 00:17:53.409 "compare_and_write": false, 00:17:53.409 "abort": true, 00:17:53.409 "nvme_admin": false, 00:17:53.409 "nvme_io": false 00:17:53.409 }, 00:17:53.409 "memory_domains": [ 00:17:53.409 { 00:17:53.409 "dma_device_id": "system", 00:17:53.409 "dma_device_type": 1 00:17:53.409 }, 00:17:53.409 { 00:17:53.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.409 "dma_device_type": 2 00:17:53.409 } 00:17:53.409 ], 00:17:53.409 "driver_specific": {} 00:17:53.409 }' 00:17:53.409 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.409 10:13:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.409 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:53.409 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.409 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.409 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:53.409 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:53.409 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:53.671 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:53.671 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.671 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.671 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:53.671 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:53.671 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:53.671 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:53.931 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:53.931 "name": "BaseBdev2", 00:17:53.931 "aliases": [ 00:17:53.931 "8dc06475-e562-4a78-a3db-5aef682fcfec" 00:17:53.931 ], 00:17:53.931 "product_name": "Malloc disk", 00:17:53.931 "block_size": 512, 00:17:53.931 "num_blocks": 65536, 00:17:53.931 "uuid": "8dc06475-e562-4a78-a3db-5aef682fcfec", 00:17:53.931 "assigned_rate_limits": { 00:17:53.931 "rw_ios_per_sec": 0, 00:17:53.931 "rw_mbytes_per_sec": 0, 00:17:53.931 "r_mbytes_per_sec": 0, 00:17:53.931 "w_mbytes_per_sec": 0 00:17:53.931 }, 00:17:53.931 "claimed": true, 00:17:53.931 "claim_type": "exclusive_write", 00:17:53.931 "zoned": false, 00:17:53.931 "supported_io_types": { 00:17:53.931 "read": true, 00:17:53.931 "write": true, 00:17:53.931 "unmap": true, 00:17:53.931 "write_zeroes": true, 00:17:53.931 "flush": true, 00:17:53.931 "reset": true, 00:17:53.931 "compare": false, 00:17:53.931 "compare_and_write": false, 00:17:53.931 "abort": true, 00:17:53.931 "nvme_admin": false, 00:17:53.931 "nvme_io": false 00:17:53.931 }, 00:17:53.931 "memory_domains": [ 00:17:53.931 { 00:17:53.931 "dma_device_id": "system", 00:17:53.931 "dma_device_type": 1 00:17:53.931 }, 00:17:53.931 { 00:17:53.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.931 "dma_device_type": 2 00:17:53.931 } 00:17:53.931 ], 00:17:53.931 "driver_specific": {} 00:17:53.931 }' 00:17:53.931 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.931 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.931 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:53.931 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.931 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.931 10:13:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:53.931 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.191 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.191 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:54.191 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.192 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.192 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:54.192 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:54.192 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:54.192 10:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:54.453 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:54.453 "name": "BaseBdev3", 00:17:54.453 "aliases": [ 00:17:54.453 "e991f74d-21d6-4c0a-8a98-88f8ba863560" 00:17:54.453 ], 00:17:54.453 "product_name": "Malloc disk", 00:17:54.453 "block_size": 512, 00:17:54.453 "num_blocks": 65536, 00:17:54.453 "uuid": "e991f74d-21d6-4c0a-8a98-88f8ba863560", 00:17:54.453 "assigned_rate_limits": { 00:17:54.453 "rw_ios_per_sec": 0, 00:17:54.453 "rw_mbytes_per_sec": 0, 00:17:54.453 "r_mbytes_per_sec": 0, 00:17:54.453 "w_mbytes_per_sec": 0 00:17:54.453 }, 00:17:54.453 "claimed": true, 00:17:54.453 "claim_type": "exclusive_write", 00:17:54.453 "zoned": false, 00:17:54.453 "supported_io_types": { 00:17:54.453 "read": true, 00:17:54.453 "write": true, 00:17:54.453 "unmap": true, 00:17:54.453 "write_zeroes": true, 00:17:54.453 "flush": true, 00:17:54.453 "reset": true, 00:17:54.453 "compare": false, 00:17:54.453 "compare_and_write": false, 00:17:54.453 "abort": true, 00:17:54.453 "nvme_admin": false, 00:17:54.453 "nvme_io": false 00:17:54.453 }, 00:17:54.453 "memory_domains": [ 00:17:54.453 { 00:17:54.453 "dma_device_id": "system", 00:17:54.453 "dma_device_type": 1 00:17:54.453 }, 00:17:54.453 { 00:17:54.453 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.453 "dma_device_type": 2 00:17:54.453 } 00:17:54.453 ], 00:17:54.453 "driver_specific": {} 00:17:54.453 }' 00:17:54.453 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.453 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.453 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:54.453 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.453 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.453 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:54.453 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.453 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.714 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:54.714 10:13:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.714 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.714 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:54.714 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:54.714 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:54.714 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:54.974 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:54.974 "name": "BaseBdev4", 00:17:54.974 "aliases": [ 00:17:54.974 "17c5a5d1-b017-4f7d-a645-c5ca43dbc4cb" 00:17:54.974 ], 00:17:54.974 "product_name": "Malloc disk", 00:17:54.975 "block_size": 512, 00:17:54.975 "num_blocks": 65536, 00:17:54.975 "uuid": "17c5a5d1-b017-4f7d-a645-c5ca43dbc4cb", 00:17:54.975 "assigned_rate_limits": { 00:17:54.975 "rw_ios_per_sec": 0, 00:17:54.975 "rw_mbytes_per_sec": 0, 00:17:54.975 "r_mbytes_per_sec": 0, 00:17:54.975 "w_mbytes_per_sec": 0 00:17:54.975 }, 00:17:54.975 "claimed": true, 00:17:54.975 "claim_type": "exclusive_write", 00:17:54.975 "zoned": false, 00:17:54.975 "supported_io_types": { 00:17:54.975 "read": true, 00:17:54.975 "write": true, 00:17:54.975 "unmap": true, 00:17:54.975 "write_zeroes": true, 00:17:54.975 "flush": true, 00:17:54.975 "reset": true, 00:17:54.975 "compare": false, 00:17:54.975 "compare_and_write": false, 00:17:54.975 "abort": true, 00:17:54.975 "nvme_admin": false, 00:17:54.975 "nvme_io": false 00:17:54.975 }, 00:17:54.975 "memory_domains": [ 00:17:54.975 { 00:17:54.975 "dma_device_id": "system", 00:17:54.975 "dma_device_type": 1 00:17:54.975 }, 00:17:54.975 { 00:17:54.975 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.975 "dma_device_type": 2 00:17:54.975 } 00:17:54.975 ], 00:17:54.975 "driver_specific": {} 00:17:54.975 }' 00:17:54.975 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.975 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.975 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:54.975 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.975 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:55.235 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:55.235 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:55.235 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:55.235 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:55.235 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:55.235 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:55.235 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:55.235 10:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:17:55.496 [2024-06-10 10:13:17.174377] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:55.496 [2024-06-10 10:13:17.174395] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:55.496 [2024-06-10 10:13:17.174433] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.496 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:55.758 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.758 "name": "Existed_Raid", 00:17:55.758 "uuid": "a933d211-8b73-43e9-a8a7-28b4d581e807", 00:17:55.758 "strip_size_kb": 64, 00:17:55.758 "state": "offline", 00:17:55.758 "raid_level": "concat", 00:17:55.758 "superblock": true, 00:17:55.758 "num_base_bdevs": 4, 00:17:55.758 "num_base_bdevs_discovered": 3, 00:17:55.758 "num_base_bdevs_operational": 3, 00:17:55.758 "base_bdevs_list": [ 00:17:55.758 { 00:17:55.758 "name": null, 00:17:55.758 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.758 "is_configured": false, 00:17:55.758 "data_offset": 2048, 00:17:55.758 "data_size": 63488 00:17:55.758 }, 00:17:55.758 { 00:17:55.758 "name": "BaseBdev2", 00:17:55.758 "uuid": "8dc06475-e562-4a78-a3db-5aef682fcfec", 00:17:55.758 "is_configured": true, 00:17:55.758 "data_offset": 2048, 00:17:55.758 "data_size": 63488 00:17:55.758 }, 00:17:55.758 { 00:17:55.758 "name": "BaseBdev3", 00:17:55.758 "uuid": "e991f74d-21d6-4c0a-8a98-88f8ba863560", 00:17:55.758 "is_configured": true, 00:17:55.758 
"data_offset": 2048, 00:17:55.758 "data_size": 63488 00:17:55.758 }, 00:17:55.758 { 00:17:55.758 "name": "BaseBdev4", 00:17:55.758 "uuid": "17c5a5d1-b017-4f7d-a645-c5ca43dbc4cb", 00:17:55.758 "is_configured": true, 00:17:55.758 "data_offset": 2048, 00:17:55.758 "data_size": 63488 00:17:55.758 } 00:17:55.758 ] 00:17:55.758 }' 00:17:55.758 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.758 10:13:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:56.329 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:56.329 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:56.329 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.329 10:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:56.329 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:56.329 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:56.329 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:56.590 [2024-06-10 10:13:18.269148] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:56.590 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:56.590 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:56.590 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.590 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:56.851 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:56.851 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:56.851 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:56.851 [2024-06-10 10:13:18.656000] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:56.851 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:56.851 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:56.851 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.851 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:57.112 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:57.112 10:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:57.112 10:13:18 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:57.373 [2024-06-10 10:13:19.042747] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:57.373 [2024-06-10 10:13:19.042773] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27954c0 name Existed_Raid, state offline 00:17:57.373 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:57.373 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:57.373 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.373 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:57.634 BaseBdev2 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:57.634 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:57.895 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:58.156 [ 00:17:58.156 { 00:17:58.156 "name": "BaseBdev2", 00:17:58.156 "aliases": [ 00:17:58.156 "8f14a589-d017-4621-bbb9-50030a948044" 00:17:58.156 ], 00:17:58.156 "product_name": "Malloc disk", 00:17:58.156 "block_size": 512, 00:17:58.156 "num_blocks": 65536, 00:17:58.156 "uuid": "8f14a589-d017-4621-bbb9-50030a948044", 00:17:58.156 "assigned_rate_limits": { 00:17:58.156 "rw_ios_per_sec": 0, 00:17:58.156 "rw_mbytes_per_sec": 0, 00:17:58.156 "r_mbytes_per_sec": 0, 00:17:58.156 "w_mbytes_per_sec": 0 00:17:58.156 }, 00:17:58.156 "claimed": false, 00:17:58.156 "zoned": false, 00:17:58.156 "supported_io_types": { 00:17:58.156 "read": true, 00:17:58.156 "write": true, 00:17:58.156 "unmap": true, 00:17:58.156 
"write_zeroes": true, 00:17:58.156 "flush": true, 00:17:58.156 "reset": true, 00:17:58.156 "compare": false, 00:17:58.156 "compare_and_write": false, 00:17:58.156 "abort": true, 00:17:58.156 "nvme_admin": false, 00:17:58.156 "nvme_io": false 00:17:58.156 }, 00:17:58.156 "memory_domains": [ 00:17:58.156 { 00:17:58.156 "dma_device_id": "system", 00:17:58.156 "dma_device_type": 1 00:17:58.156 }, 00:17:58.156 { 00:17:58.156 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.156 "dma_device_type": 2 00:17:58.156 } 00:17:58.156 ], 00:17:58.156 "driver_specific": {} 00:17:58.156 } 00:17:58.156 ] 00:17:58.156 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:58.156 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:58.156 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:58.156 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:58.156 BaseBdev3 00:17:58.156 10:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:58.156 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:17:58.156 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:58.156 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:58.156 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:58.156 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:58.156 10:13:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:58.417 10:13:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:58.678 [ 00:17:58.678 { 00:17:58.678 "name": "BaseBdev3", 00:17:58.678 "aliases": [ 00:17:58.678 "345fdae6-0385-4269-a977-8e39d76cc238" 00:17:58.678 ], 00:17:58.678 "product_name": "Malloc disk", 00:17:58.678 "block_size": 512, 00:17:58.678 "num_blocks": 65536, 00:17:58.678 "uuid": "345fdae6-0385-4269-a977-8e39d76cc238", 00:17:58.678 "assigned_rate_limits": { 00:17:58.678 "rw_ios_per_sec": 0, 00:17:58.678 "rw_mbytes_per_sec": 0, 00:17:58.678 "r_mbytes_per_sec": 0, 00:17:58.678 "w_mbytes_per_sec": 0 00:17:58.678 }, 00:17:58.678 "claimed": false, 00:17:58.678 "zoned": false, 00:17:58.678 "supported_io_types": { 00:17:58.678 "read": true, 00:17:58.678 "write": true, 00:17:58.678 "unmap": true, 00:17:58.678 "write_zeroes": true, 00:17:58.678 "flush": true, 00:17:58.678 "reset": true, 00:17:58.678 "compare": false, 00:17:58.678 "compare_and_write": false, 00:17:58.678 "abort": true, 00:17:58.678 "nvme_admin": false, 00:17:58.678 "nvme_io": false 00:17:58.678 }, 00:17:58.678 "memory_domains": [ 00:17:58.678 { 00:17:58.678 "dma_device_id": "system", 00:17:58.678 "dma_device_type": 1 00:17:58.678 }, 00:17:58.678 { 00:17:58.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.678 "dma_device_type": 2 00:17:58.678 } 00:17:58.678 ], 00:17:58.678 
"driver_specific": {} 00:17:58.678 } 00:17:58.678 ] 00:17:58.678 10:13:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:58.678 10:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:58.678 10:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:58.678 10:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:58.939 BaseBdev4 00:17:58.939 10:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:58.939 10:13:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:17:58.939 10:13:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:58.939 10:13:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:58.939 10:13:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:58.939 10:13:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:58.939 10:13:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:58.939 10:13:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:59.201 [ 00:17:59.201 { 00:17:59.201 "name": "BaseBdev4", 00:17:59.201 "aliases": [ 00:17:59.201 "0ec42772-44d3-4cca-90c8-6092b4863f28" 00:17:59.201 ], 00:17:59.201 "product_name": "Malloc disk", 00:17:59.201 "block_size": 512, 00:17:59.201 "num_blocks": 65536, 00:17:59.201 "uuid": "0ec42772-44d3-4cca-90c8-6092b4863f28", 00:17:59.201 "assigned_rate_limits": { 00:17:59.201 "rw_ios_per_sec": 0, 00:17:59.201 "rw_mbytes_per_sec": 0, 00:17:59.201 "r_mbytes_per_sec": 0, 00:17:59.201 "w_mbytes_per_sec": 0 00:17:59.201 }, 00:17:59.201 "claimed": false, 00:17:59.201 "zoned": false, 00:17:59.201 "supported_io_types": { 00:17:59.201 "read": true, 00:17:59.201 "write": true, 00:17:59.201 "unmap": true, 00:17:59.201 "write_zeroes": true, 00:17:59.201 "flush": true, 00:17:59.201 "reset": true, 00:17:59.201 "compare": false, 00:17:59.201 "compare_and_write": false, 00:17:59.201 "abort": true, 00:17:59.201 "nvme_admin": false, 00:17:59.201 "nvme_io": false 00:17:59.201 }, 00:17:59.201 "memory_domains": [ 00:17:59.201 { 00:17:59.201 "dma_device_id": "system", 00:17:59.201 "dma_device_type": 1 00:17:59.201 }, 00:17:59.201 { 00:17:59.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.201 "dma_device_type": 2 00:17:59.201 } 00:17:59.201 ], 00:17:59.201 "driver_specific": {} 00:17:59.201 } 00:17:59.201 ] 00:17:59.201 10:13:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:59.201 10:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:59.201 10:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:59.201 10:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:59.462 [2024-06-10 10:13:21.117657] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:59.462 [2024-06-10 10:13:21.117687] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:59.462 [2024-06-10 10:13:21.117699] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:59.462 [2024-06-10 10:13:21.118731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:59.462 [2024-06-10 10:13:21.118762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:59.462 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.462 "name": "Existed_Raid", 00:17:59.462 "uuid": "28cb62f4-533b-46e7-bdd3-dbdea36d5516", 00:17:59.462 "strip_size_kb": 64, 00:17:59.462 "state": "configuring", 00:17:59.462 "raid_level": "concat", 00:17:59.462 "superblock": true, 00:17:59.462 "num_base_bdevs": 4, 00:17:59.462 "num_base_bdevs_discovered": 3, 00:17:59.462 "num_base_bdevs_operational": 4, 00:17:59.462 "base_bdevs_list": [ 00:17:59.462 { 00:17:59.462 "name": "BaseBdev1", 00:17:59.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:59.462 "is_configured": false, 00:17:59.462 "data_offset": 0, 00:17:59.462 "data_size": 0 00:17:59.463 }, 00:17:59.463 { 00:17:59.463 "name": "BaseBdev2", 00:17:59.463 "uuid": "8f14a589-d017-4621-bbb9-50030a948044", 00:17:59.463 "is_configured": true, 00:17:59.463 "data_offset": 2048, 00:17:59.463 "data_size": 63488 00:17:59.463 }, 00:17:59.463 { 00:17:59.463 "name": "BaseBdev3", 00:17:59.463 "uuid": "345fdae6-0385-4269-a977-8e39d76cc238", 00:17:59.463 "is_configured": true, 00:17:59.463 "data_offset": 2048, 00:17:59.463 "data_size": 63488 00:17:59.463 }, 00:17:59.463 { 00:17:59.463 "name": "BaseBdev4", 00:17:59.463 "uuid": 
"0ec42772-44d3-4cca-90c8-6092b4863f28", 00:17:59.463 "is_configured": true, 00:17:59.463 "data_offset": 2048, 00:17:59.463 "data_size": 63488 00:17:59.463 } 00:17:59.463 ] 00:17:59.463 }' 00:17:59.463 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.463 10:13:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:00.034 10:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:00.296 [2024-06-10 10:13:22.031938] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.296 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:00.557 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.557 "name": "Existed_Raid", 00:18:00.557 "uuid": "28cb62f4-533b-46e7-bdd3-dbdea36d5516", 00:18:00.557 "strip_size_kb": 64, 00:18:00.557 "state": "configuring", 00:18:00.557 "raid_level": "concat", 00:18:00.558 "superblock": true, 00:18:00.558 "num_base_bdevs": 4, 00:18:00.558 "num_base_bdevs_discovered": 2, 00:18:00.558 "num_base_bdevs_operational": 4, 00:18:00.558 "base_bdevs_list": [ 00:18:00.558 { 00:18:00.558 "name": "BaseBdev1", 00:18:00.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.558 "is_configured": false, 00:18:00.558 "data_offset": 0, 00:18:00.558 "data_size": 0 00:18:00.558 }, 00:18:00.558 { 00:18:00.558 "name": null, 00:18:00.558 "uuid": "8f14a589-d017-4621-bbb9-50030a948044", 00:18:00.558 "is_configured": false, 00:18:00.558 "data_offset": 2048, 00:18:00.558 "data_size": 63488 00:18:00.558 }, 00:18:00.558 { 00:18:00.558 "name": "BaseBdev3", 00:18:00.558 "uuid": "345fdae6-0385-4269-a977-8e39d76cc238", 00:18:00.558 "is_configured": true, 00:18:00.558 "data_offset": 2048, 00:18:00.558 "data_size": 63488 00:18:00.558 }, 00:18:00.558 { 00:18:00.558 "name": "BaseBdev4", 00:18:00.558 "uuid": "0ec42772-44d3-4cca-90c8-6092b4863f28", 00:18:00.558 
"is_configured": true, 00:18:00.558 "data_offset": 2048, 00:18:00.558 "data_size": 63488 00:18:00.558 } 00:18:00.558 ] 00:18:00.558 }' 00:18:00.558 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.558 10:13:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:01.130 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.130 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:01.130 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:01.130 10:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:01.391 [2024-06-10 10:13:23.167694] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:01.391 BaseBdev1 00:18:01.391 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:01.391 10:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:18:01.391 10:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:01.391 10:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:01.391 10:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:01.391 10:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:01.391 10:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:01.652 10:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:01.913 [ 00:18:01.914 { 00:18:01.914 "name": "BaseBdev1", 00:18:01.914 "aliases": [ 00:18:01.914 "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6" 00:18:01.914 ], 00:18:01.914 "product_name": "Malloc disk", 00:18:01.914 "block_size": 512, 00:18:01.914 "num_blocks": 65536, 00:18:01.914 "uuid": "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6", 00:18:01.914 "assigned_rate_limits": { 00:18:01.914 "rw_ios_per_sec": 0, 00:18:01.914 "rw_mbytes_per_sec": 0, 00:18:01.914 "r_mbytes_per_sec": 0, 00:18:01.914 "w_mbytes_per_sec": 0 00:18:01.914 }, 00:18:01.914 "claimed": true, 00:18:01.914 "claim_type": "exclusive_write", 00:18:01.914 "zoned": false, 00:18:01.914 "supported_io_types": { 00:18:01.914 "read": true, 00:18:01.914 "write": true, 00:18:01.914 "unmap": true, 00:18:01.914 "write_zeroes": true, 00:18:01.914 "flush": true, 00:18:01.914 "reset": true, 00:18:01.914 "compare": false, 00:18:01.914 "compare_and_write": false, 00:18:01.914 "abort": true, 00:18:01.914 "nvme_admin": false, 00:18:01.914 "nvme_io": false 00:18:01.914 }, 00:18:01.914 "memory_domains": [ 00:18:01.914 { 00:18:01.914 "dma_device_id": "system", 00:18:01.914 "dma_device_type": 1 00:18:01.914 }, 00:18:01.914 { 00:18:01.914 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.914 "dma_device_type": 2 
00:18:01.914 } 00:18:01.914 ], 00:18:01.914 "driver_specific": {} 00:18:01.914 } 00:18:01.914 ] 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.914 "name": "Existed_Raid", 00:18:01.914 "uuid": "28cb62f4-533b-46e7-bdd3-dbdea36d5516", 00:18:01.914 "strip_size_kb": 64, 00:18:01.914 "state": "configuring", 00:18:01.914 "raid_level": "concat", 00:18:01.914 "superblock": true, 00:18:01.914 "num_base_bdevs": 4, 00:18:01.914 "num_base_bdevs_discovered": 3, 00:18:01.914 "num_base_bdevs_operational": 4, 00:18:01.914 "base_bdevs_list": [ 00:18:01.914 { 00:18:01.914 "name": "BaseBdev1", 00:18:01.914 "uuid": "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6", 00:18:01.914 "is_configured": true, 00:18:01.914 "data_offset": 2048, 00:18:01.914 "data_size": 63488 00:18:01.914 }, 00:18:01.914 { 00:18:01.914 "name": null, 00:18:01.914 "uuid": "8f14a589-d017-4621-bbb9-50030a948044", 00:18:01.914 "is_configured": false, 00:18:01.914 "data_offset": 2048, 00:18:01.914 "data_size": 63488 00:18:01.914 }, 00:18:01.914 { 00:18:01.914 "name": "BaseBdev3", 00:18:01.914 "uuid": "345fdae6-0385-4269-a977-8e39d76cc238", 00:18:01.914 "is_configured": true, 00:18:01.914 "data_offset": 2048, 00:18:01.914 "data_size": 63488 00:18:01.914 }, 00:18:01.914 { 00:18:01.914 "name": "BaseBdev4", 00:18:01.914 "uuid": "0ec42772-44d3-4cca-90c8-6092b4863f28", 00:18:01.914 "is_configured": true, 00:18:01.914 "data_offset": 2048, 00:18:01.914 "data_size": 63488 00:18:01.914 } 00:18:01.914 ] 00:18:01.914 }' 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.914 10:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:02.520 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.520 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:02.794 [2024-06-10 10:13:24.579282] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.794 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:03.064 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.064 "name": "Existed_Raid", 00:18:03.064 "uuid": "28cb62f4-533b-46e7-bdd3-dbdea36d5516", 00:18:03.064 "strip_size_kb": 64, 00:18:03.064 "state": "configuring", 00:18:03.064 "raid_level": "concat", 00:18:03.064 "superblock": true, 00:18:03.064 "num_base_bdevs": 4, 00:18:03.064 "num_base_bdevs_discovered": 2, 00:18:03.064 "num_base_bdevs_operational": 4, 00:18:03.064 "base_bdevs_list": [ 00:18:03.064 { 00:18:03.064 "name": "BaseBdev1", 00:18:03.064 "uuid": "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6", 00:18:03.064 "is_configured": true, 00:18:03.064 "data_offset": 2048, 00:18:03.064 "data_size": 63488 00:18:03.064 }, 00:18:03.064 { 00:18:03.064 "name": null, 00:18:03.064 "uuid": "8f14a589-d017-4621-bbb9-50030a948044", 00:18:03.064 "is_configured": false, 00:18:03.064 "data_offset": 2048, 00:18:03.064 "data_size": 63488 00:18:03.064 }, 00:18:03.064 { 00:18:03.064 "name": null, 00:18:03.064 "uuid": "345fdae6-0385-4269-a977-8e39d76cc238", 00:18:03.064 "is_configured": false, 00:18:03.064 "data_offset": 2048, 00:18:03.064 "data_size": 63488 00:18:03.064 }, 00:18:03.064 { 00:18:03.064 "name": "BaseBdev4", 00:18:03.064 "uuid": "0ec42772-44d3-4cca-90c8-6092b4863f28", 00:18:03.064 "is_configured": true, 00:18:03.064 "data_offset": 2048, 00:18:03.064 "data_size": 63488 00:18:03.064 } 
00:18:03.064 ] 00:18:03.064 }' 00:18:03.064 10:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.064 10:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:03.634 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.634 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:03.894 [2024-06-10 10:13:25.706152] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.894 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:04.154 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:04.154 "name": "Existed_Raid", 00:18:04.155 "uuid": "28cb62f4-533b-46e7-bdd3-dbdea36d5516", 00:18:04.155 "strip_size_kb": 64, 00:18:04.155 "state": "configuring", 00:18:04.155 "raid_level": "concat", 00:18:04.155 "superblock": true, 00:18:04.155 "num_base_bdevs": 4, 00:18:04.155 "num_base_bdevs_discovered": 3, 00:18:04.155 "num_base_bdevs_operational": 4, 00:18:04.155 "base_bdevs_list": [ 00:18:04.155 { 00:18:04.155 "name": "BaseBdev1", 00:18:04.155 "uuid": "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6", 00:18:04.155 "is_configured": true, 00:18:04.155 "data_offset": 2048, 00:18:04.155 "data_size": 63488 00:18:04.155 }, 00:18:04.155 { 00:18:04.155 "name": null, 00:18:04.155 "uuid": "8f14a589-d017-4621-bbb9-50030a948044", 00:18:04.155 "is_configured": false, 00:18:04.155 "data_offset": 2048, 00:18:04.155 "data_size": 63488 00:18:04.155 }, 00:18:04.155 { 
00:18:04.155 "name": "BaseBdev3", 00:18:04.155 "uuid": "345fdae6-0385-4269-a977-8e39d76cc238", 00:18:04.155 "is_configured": true, 00:18:04.155 "data_offset": 2048, 00:18:04.155 "data_size": 63488 00:18:04.155 }, 00:18:04.155 { 00:18:04.155 "name": "BaseBdev4", 00:18:04.155 "uuid": "0ec42772-44d3-4cca-90c8-6092b4863f28", 00:18:04.155 "is_configured": true, 00:18:04.155 "data_offset": 2048, 00:18:04.155 "data_size": 63488 00:18:04.155 } 00:18:04.155 ] 00:18:04.155 }' 00:18:04.155 10:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:04.155 10:13:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:04.726 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.726 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:04.988 [2024-06-10 10:13:26.796927] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.988 10:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:05.248 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.248 "name": "Existed_Raid", 00:18:05.248 "uuid": "28cb62f4-533b-46e7-bdd3-dbdea36d5516", 00:18:05.248 "strip_size_kb": 64, 00:18:05.248 "state": "configuring", 00:18:05.248 "raid_level": "concat", 00:18:05.248 "superblock": true, 00:18:05.248 "num_base_bdevs": 4, 00:18:05.248 "num_base_bdevs_discovered": 2, 00:18:05.248 "num_base_bdevs_operational": 4, 00:18:05.248 "base_bdevs_list": [ 00:18:05.248 { 00:18:05.248 "name": null, 00:18:05.248 "uuid": 
"5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6", 00:18:05.248 "is_configured": false, 00:18:05.248 "data_offset": 2048, 00:18:05.248 "data_size": 63488 00:18:05.248 }, 00:18:05.248 { 00:18:05.248 "name": null, 00:18:05.248 "uuid": "8f14a589-d017-4621-bbb9-50030a948044", 00:18:05.248 "is_configured": false, 00:18:05.248 "data_offset": 2048, 00:18:05.248 "data_size": 63488 00:18:05.248 }, 00:18:05.248 { 00:18:05.248 "name": "BaseBdev3", 00:18:05.248 "uuid": "345fdae6-0385-4269-a977-8e39d76cc238", 00:18:05.248 "is_configured": true, 00:18:05.248 "data_offset": 2048, 00:18:05.248 "data_size": 63488 00:18:05.248 }, 00:18:05.248 { 00:18:05.248 "name": "BaseBdev4", 00:18:05.248 "uuid": "0ec42772-44d3-4cca-90c8-6092b4863f28", 00:18:05.248 "is_configured": true, 00:18:05.248 "data_offset": 2048, 00:18:05.248 "data_size": 63488 00:18:05.248 } 00:18:05.248 ] 00:18:05.248 }' 00:18:05.248 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.248 10:13:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:05.818 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.818 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:06.077 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:06.077 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:06.077 [2024-06-10 10:13:27.921547] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.337 10:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:06.337 10:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:06.337 "name": 
"Existed_Raid", 00:18:06.337 "uuid": "28cb62f4-533b-46e7-bdd3-dbdea36d5516", 00:18:06.337 "strip_size_kb": 64, 00:18:06.337 "state": "configuring", 00:18:06.337 "raid_level": "concat", 00:18:06.337 "superblock": true, 00:18:06.337 "num_base_bdevs": 4, 00:18:06.337 "num_base_bdevs_discovered": 3, 00:18:06.337 "num_base_bdevs_operational": 4, 00:18:06.337 "base_bdevs_list": [ 00:18:06.337 { 00:18:06.337 "name": null, 00:18:06.337 "uuid": "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6", 00:18:06.337 "is_configured": false, 00:18:06.337 "data_offset": 2048, 00:18:06.337 "data_size": 63488 00:18:06.337 }, 00:18:06.337 { 00:18:06.337 "name": "BaseBdev2", 00:18:06.337 "uuid": "8f14a589-d017-4621-bbb9-50030a948044", 00:18:06.337 "is_configured": true, 00:18:06.337 "data_offset": 2048, 00:18:06.337 "data_size": 63488 00:18:06.337 }, 00:18:06.337 { 00:18:06.337 "name": "BaseBdev3", 00:18:06.337 "uuid": "345fdae6-0385-4269-a977-8e39d76cc238", 00:18:06.337 "is_configured": true, 00:18:06.337 "data_offset": 2048, 00:18:06.337 "data_size": 63488 00:18:06.337 }, 00:18:06.337 { 00:18:06.337 "name": "BaseBdev4", 00:18:06.337 "uuid": "0ec42772-44d3-4cca-90c8-6092b4863f28", 00:18:06.337 "is_configured": true, 00:18:06.337 "data_offset": 2048, 00:18:06.337 "data_size": 63488 00:18:06.337 } 00:18:06.337 ] 00:18:06.337 }' 00:18:06.337 10:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:06.337 10:13:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:06.908 10:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.908 10:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:07.168 10:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:07.168 10:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.168 10:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:07.429 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6 00:18:07.429 [2024-06-10 10:13:29.241872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:07.429 [2024-06-10 10:13:29.241984] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27950f0 00:18:07.429 [2024-06-10 10:13:29.241991] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:07.429 [2024-06-10 10:13:29.242129] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2492240 00:18:07.429 [2024-06-10 10:13:29.242217] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27950f0 00:18:07.429 [2024-06-10 10:13:29.242222] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x27950f0 00:18:07.429 [2024-06-10 10:13:29.242287] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:07.429 NewBaseBdev 00:18:07.429 10:13:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:07.429 10:13:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:18:07.429 10:13:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:07.429 10:13:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:07.429 10:13:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:07.429 10:13:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:07.429 10:13:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:07.689 10:13:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:07.950 [ 00:18:07.950 { 00:18:07.950 "name": "NewBaseBdev", 00:18:07.950 "aliases": [ 00:18:07.950 "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6" 00:18:07.950 ], 00:18:07.950 "product_name": "Malloc disk", 00:18:07.950 "block_size": 512, 00:18:07.950 "num_blocks": 65536, 00:18:07.950 "uuid": "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6", 00:18:07.950 "assigned_rate_limits": { 00:18:07.950 "rw_ios_per_sec": 0, 00:18:07.950 "rw_mbytes_per_sec": 0, 00:18:07.950 "r_mbytes_per_sec": 0, 00:18:07.950 "w_mbytes_per_sec": 0 00:18:07.950 }, 00:18:07.950 "claimed": true, 00:18:07.950 "claim_type": "exclusive_write", 00:18:07.950 "zoned": false, 00:18:07.950 "supported_io_types": { 00:18:07.950 "read": true, 00:18:07.950 "write": true, 00:18:07.950 "unmap": true, 00:18:07.950 "write_zeroes": true, 00:18:07.950 "flush": true, 00:18:07.950 "reset": true, 00:18:07.950 "compare": false, 00:18:07.950 "compare_and_write": false, 00:18:07.950 "abort": true, 00:18:07.950 "nvme_admin": false, 00:18:07.950 "nvme_io": false 00:18:07.950 }, 00:18:07.950 "memory_domains": [ 00:18:07.950 { 00:18:07.950 "dma_device_id": "system", 00:18:07.950 "dma_device_type": 1 00:18:07.950 }, 00:18:07.950 { 00:18:07.950 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:07.950 "dma_device_type": 2 00:18:07.950 } 00:18:07.950 ], 00:18:07.950 "driver_specific": {} 00:18:07.950 } 00:18:07.950 ] 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.950 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.210 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.210 "name": "Existed_Raid", 00:18:08.210 "uuid": "28cb62f4-533b-46e7-bdd3-dbdea36d5516", 00:18:08.210 "strip_size_kb": 64, 00:18:08.210 "state": "online", 00:18:08.211 "raid_level": "concat", 00:18:08.211 "superblock": true, 00:18:08.211 "num_base_bdevs": 4, 00:18:08.211 "num_base_bdevs_discovered": 4, 00:18:08.211 "num_base_bdevs_operational": 4, 00:18:08.211 "base_bdevs_list": [ 00:18:08.211 { 00:18:08.211 "name": "NewBaseBdev", 00:18:08.211 "uuid": "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6", 00:18:08.211 "is_configured": true, 00:18:08.211 "data_offset": 2048, 00:18:08.211 "data_size": 63488 00:18:08.211 }, 00:18:08.211 { 00:18:08.211 "name": "BaseBdev2", 00:18:08.211 "uuid": "8f14a589-d017-4621-bbb9-50030a948044", 00:18:08.211 "is_configured": true, 00:18:08.211 "data_offset": 2048, 00:18:08.211 "data_size": 63488 00:18:08.211 }, 00:18:08.211 { 00:18:08.211 "name": "BaseBdev3", 00:18:08.211 "uuid": "345fdae6-0385-4269-a977-8e39d76cc238", 00:18:08.211 "is_configured": true, 00:18:08.211 "data_offset": 2048, 00:18:08.211 "data_size": 63488 00:18:08.211 }, 00:18:08.211 { 00:18:08.211 "name": "BaseBdev4", 00:18:08.211 "uuid": "0ec42772-44d3-4cca-90c8-6092b4863f28", 00:18:08.211 "is_configured": true, 00:18:08.211 "data_offset": 2048, 00:18:08.211 "data_size": 63488 00:18:08.211 } 00:18:08.211 ] 00:18:08.211 }' 00:18:08.211 10:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.211 10:13:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:08.782 [2024-06-10 10:13:30.577493] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:08.782 "name": "Existed_Raid", 00:18:08.782 "aliases": [ 00:18:08.782 "28cb62f4-533b-46e7-bdd3-dbdea36d5516" 00:18:08.782 ], 00:18:08.782 "product_name": "Raid 
Volume", 00:18:08.782 "block_size": 512, 00:18:08.782 "num_blocks": 253952, 00:18:08.782 "uuid": "28cb62f4-533b-46e7-bdd3-dbdea36d5516", 00:18:08.782 "assigned_rate_limits": { 00:18:08.782 "rw_ios_per_sec": 0, 00:18:08.782 "rw_mbytes_per_sec": 0, 00:18:08.782 "r_mbytes_per_sec": 0, 00:18:08.782 "w_mbytes_per_sec": 0 00:18:08.782 }, 00:18:08.782 "claimed": false, 00:18:08.782 "zoned": false, 00:18:08.782 "supported_io_types": { 00:18:08.782 "read": true, 00:18:08.782 "write": true, 00:18:08.782 "unmap": true, 00:18:08.782 "write_zeroes": true, 00:18:08.782 "flush": true, 00:18:08.782 "reset": true, 00:18:08.782 "compare": false, 00:18:08.782 "compare_and_write": false, 00:18:08.782 "abort": false, 00:18:08.782 "nvme_admin": false, 00:18:08.782 "nvme_io": false 00:18:08.782 }, 00:18:08.782 "memory_domains": [ 00:18:08.782 { 00:18:08.782 "dma_device_id": "system", 00:18:08.782 "dma_device_type": 1 00:18:08.782 }, 00:18:08.782 { 00:18:08.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:08.782 "dma_device_type": 2 00:18:08.782 }, 00:18:08.782 { 00:18:08.782 "dma_device_id": "system", 00:18:08.782 "dma_device_type": 1 00:18:08.782 }, 00:18:08.782 { 00:18:08.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:08.782 "dma_device_type": 2 00:18:08.782 }, 00:18:08.782 { 00:18:08.782 "dma_device_id": "system", 00:18:08.782 "dma_device_type": 1 00:18:08.782 }, 00:18:08.782 { 00:18:08.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:08.782 "dma_device_type": 2 00:18:08.782 }, 00:18:08.782 { 00:18:08.782 "dma_device_id": "system", 00:18:08.782 "dma_device_type": 1 00:18:08.782 }, 00:18:08.782 { 00:18:08.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:08.782 "dma_device_type": 2 00:18:08.782 } 00:18:08.782 ], 00:18:08.782 "driver_specific": { 00:18:08.782 "raid": { 00:18:08.782 "uuid": "28cb62f4-533b-46e7-bdd3-dbdea36d5516", 00:18:08.782 "strip_size_kb": 64, 00:18:08.782 "state": "online", 00:18:08.782 "raid_level": "concat", 00:18:08.782 "superblock": true, 00:18:08.782 "num_base_bdevs": 4, 00:18:08.782 "num_base_bdevs_discovered": 4, 00:18:08.782 "num_base_bdevs_operational": 4, 00:18:08.782 "base_bdevs_list": [ 00:18:08.782 { 00:18:08.782 "name": "NewBaseBdev", 00:18:08.782 "uuid": "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6", 00:18:08.782 "is_configured": true, 00:18:08.782 "data_offset": 2048, 00:18:08.782 "data_size": 63488 00:18:08.782 }, 00:18:08.782 { 00:18:08.782 "name": "BaseBdev2", 00:18:08.782 "uuid": "8f14a589-d017-4621-bbb9-50030a948044", 00:18:08.782 "is_configured": true, 00:18:08.782 "data_offset": 2048, 00:18:08.782 "data_size": 63488 00:18:08.782 }, 00:18:08.782 { 00:18:08.782 "name": "BaseBdev3", 00:18:08.782 "uuid": "345fdae6-0385-4269-a977-8e39d76cc238", 00:18:08.782 "is_configured": true, 00:18:08.782 "data_offset": 2048, 00:18:08.782 "data_size": 63488 00:18:08.782 }, 00:18:08.782 { 00:18:08.782 "name": "BaseBdev4", 00:18:08.782 "uuid": "0ec42772-44d3-4cca-90c8-6092b4863f28", 00:18:08.782 "is_configured": true, 00:18:08.782 "data_offset": 2048, 00:18:08.782 "data_size": 63488 00:18:08.782 } 00:18:08.782 ] 00:18:08.782 } 00:18:08.782 } 00:18:08.782 }' 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:08.782 BaseBdev2 00:18:08.782 BaseBdev3 00:18:08.782 BaseBdev4' 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:08.782 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:09.043 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:09.043 "name": "NewBaseBdev", 00:18:09.043 "aliases": [ 00:18:09.043 "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6" 00:18:09.043 ], 00:18:09.043 "product_name": "Malloc disk", 00:18:09.043 "block_size": 512, 00:18:09.043 "num_blocks": 65536, 00:18:09.043 "uuid": "5a9f9fba-21d4-4943-8c70-ca73c0cd4fb6", 00:18:09.043 "assigned_rate_limits": { 00:18:09.043 "rw_ios_per_sec": 0, 00:18:09.043 "rw_mbytes_per_sec": 0, 00:18:09.043 "r_mbytes_per_sec": 0, 00:18:09.043 "w_mbytes_per_sec": 0 00:18:09.043 }, 00:18:09.043 "claimed": true, 00:18:09.043 "claim_type": "exclusive_write", 00:18:09.043 "zoned": false, 00:18:09.043 "supported_io_types": { 00:18:09.043 "read": true, 00:18:09.043 "write": true, 00:18:09.043 "unmap": true, 00:18:09.043 "write_zeroes": true, 00:18:09.043 "flush": true, 00:18:09.043 "reset": true, 00:18:09.043 "compare": false, 00:18:09.043 "compare_and_write": false, 00:18:09.043 "abort": true, 00:18:09.043 "nvme_admin": false, 00:18:09.043 "nvme_io": false 00:18:09.043 }, 00:18:09.043 "memory_domains": [ 00:18:09.043 { 00:18:09.043 "dma_device_id": "system", 00:18:09.043 "dma_device_type": 1 00:18:09.043 }, 00:18:09.043 { 00:18:09.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.043 "dma_device_type": 2 00:18:09.043 } 00:18:09.043 ], 00:18:09.043 "driver_specific": {} 00:18:09.043 }' 00:18:09.043 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:09.043 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:09.303 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:09.304 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:09.304 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:09.304 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:09.304 10:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:09.304 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:09.304 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:09.304 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:09.304 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:09.304 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:09.304 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:09.304 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:09.304 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:09.563 10:13:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:09.563 "name": "BaseBdev2", 00:18:09.563 "aliases": [ 00:18:09.563 "8f14a589-d017-4621-bbb9-50030a948044" 00:18:09.563 ], 00:18:09.563 "product_name": "Malloc disk", 00:18:09.563 "block_size": 512, 00:18:09.563 "num_blocks": 65536, 00:18:09.563 "uuid": "8f14a589-d017-4621-bbb9-50030a948044", 00:18:09.563 "assigned_rate_limits": { 00:18:09.563 "rw_ios_per_sec": 0, 00:18:09.563 "rw_mbytes_per_sec": 0, 00:18:09.563 "r_mbytes_per_sec": 0, 00:18:09.563 "w_mbytes_per_sec": 0 00:18:09.563 }, 00:18:09.563 "claimed": true, 00:18:09.563 "claim_type": "exclusive_write", 00:18:09.563 "zoned": false, 00:18:09.563 "supported_io_types": { 00:18:09.563 "read": true, 00:18:09.563 "write": true, 00:18:09.563 "unmap": true, 00:18:09.563 "write_zeroes": true, 00:18:09.563 "flush": true, 00:18:09.563 "reset": true, 00:18:09.563 "compare": false, 00:18:09.563 "compare_and_write": false, 00:18:09.563 "abort": true, 00:18:09.563 "nvme_admin": false, 00:18:09.563 "nvme_io": false 00:18:09.563 }, 00:18:09.563 "memory_domains": [ 00:18:09.563 { 00:18:09.563 "dma_device_id": "system", 00:18:09.563 "dma_device_type": 1 00:18:09.563 }, 00:18:09.563 { 00:18:09.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.563 "dma_device_type": 2 00:18:09.563 } 00:18:09.563 ], 00:18:09.563 "driver_specific": {} 00:18:09.563 }' 00:18:09.563 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:09.563 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:09.824 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:09.824 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:09.824 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:09.824 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:09.824 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:09.824 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:09.824 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:09.824 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:09.824 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:09.824 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:09.824 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:10.084 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:10.084 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:10.084 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:10.084 "name": "BaseBdev3", 00:18:10.084 "aliases": [ 00:18:10.084 "345fdae6-0385-4269-a977-8e39d76cc238" 00:18:10.084 ], 00:18:10.084 "product_name": "Malloc disk", 00:18:10.084 "block_size": 512, 00:18:10.084 "num_blocks": 65536, 00:18:10.085 "uuid": "345fdae6-0385-4269-a977-8e39d76cc238", 00:18:10.085 "assigned_rate_limits": { 00:18:10.085 "rw_ios_per_sec": 0, 00:18:10.085 
"rw_mbytes_per_sec": 0, 00:18:10.085 "r_mbytes_per_sec": 0, 00:18:10.085 "w_mbytes_per_sec": 0 00:18:10.085 }, 00:18:10.085 "claimed": true, 00:18:10.085 "claim_type": "exclusive_write", 00:18:10.085 "zoned": false, 00:18:10.085 "supported_io_types": { 00:18:10.085 "read": true, 00:18:10.085 "write": true, 00:18:10.085 "unmap": true, 00:18:10.085 "write_zeroes": true, 00:18:10.085 "flush": true, 00:18:10.085 "reset": true, 00:18:10.085 "compare": false, 00:18:10.085 "compare_and_write": false, 00:18:10.085 "abort": true, 00:18:10.085 "nvme_admin": false, 00:18:10.085 "nvme_io": false 00:18:10.085 }, 00:18:10.085 "memory_domains": [ 00:18:10.085 { 00:18:10.085 "dma_device_id": "system", 00:18:10.085 "dma_device_type": 1 00:18:10.085 }, 00:18:10.085 { 00:18:10.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.085 "dma_device_type": 2 00:18:10.085 } 00:18:10.085 ], 00:18:10.085 "driver_specific": {} 00:18:10.085 }' 00:18:10.085 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.085 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.354 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:10.354 10:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.354 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.354 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:10.354 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.354 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.354 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:10.354 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.354 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.621 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:10.621 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:10.621 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:10.621 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:10.621 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:10.621 "name": "BaseBdev4", 00:18:10.621 "aliases": [ 00:18:10.621 "0ec42772-44d3-4cca-90c8-6092b4863f28" 00:18:10.621 ], 00:18:10.621 "product_name": "Malloc disk", 00:18:10.621 "block_size": 512, 00:18:10.621 "num_blocks": 65536, 00:18:10.621 "uuid": "0ec42772-44d3-4cca-90c8-6092b4863f28", 00:18:10.621 "assigned_rate_limits": { 00:18:10.621 "rw_ios_per_sec": 0, 00:18:10.621 "rw_mbytes_per_sec": 0, 00:18:10.621 "r_mbytes_per_sec": 0, 00:18:10.621 "w_mbytes_per_sec": 0 00:18:10.621 }, 00:18:10.621 "claimed": true, 00:18:10.621 "claim_type": "exclusive_write", 00:18:10.621 "zoned": false, 00:18:10.621 "supported_io_types": { 00:18:10.621 "read": true, 00:18:10.621 "write": true, 00:18:10.621 "unmap": true, 00:18:10.621 "write_zeroes": true, 00:18:10.621 "flush": true, 00:18:10.621 "reset": true, 
00:18:10.621 "compare": false, 00:18:10.621 "compare_and_write": false, 00:18:10.621 "abort": true, 00:18:10.621 "nvme_admin": false, 00:18:10.621 "nvme_io": false 00:18:10.621 }, 00:18:10.621 "memory_domains": [ 00:18:10.621 { 00:18:10.621 "dma_device_id": "system", 00:18:10.621 "dma_device_type": 1 00:18:10.621 }, 00:18:10.621 { 00:18:10.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.621 "dma_device_type": 2 00:18:10.621 } 00:18:10.621 ], 00:18:10.621 "driver_specific": {} 00:18:10.621 }' 00:18:10.621 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.621 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.882 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:10.882 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.882 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.882 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:10.882 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.882 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.882 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:10.882 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.142 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.142 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:11.142 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:11.142 [2024-06-10 10:13:32.951271] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:11.142 [2024-06-10 10:13:32.951290] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:11.142 [2024-06-10 10:13:32.951332] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:11.142 [2024-06-10 10:13:32.951380] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:11.142 [2024-06-10 10:13:32.951386] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27950f0 name Existed_Raid, state offline 00:18:11.142 10:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1042350 00:18:11.142 10:13:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1042350 ']' 00:18:11.142 10:13:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1042350 00:18:11.142 10:13:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:18:11.142 10:13:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:11.142 10:13:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1042350 00:18:11.403 10:13:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:11.403 10:13:33 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:11.403 10:13:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1042350' 00:18:11.403 killing process with pid 1042350 00:18:11.403 10:13:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1042350 00:18:11.403 [2024-06-10 10:13:33.016600] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:11.403 10:13:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1042350 00:18:11.403 [2024-06-10 10:13:33.037200] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:11.403 10:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:11.403 00:18:11.403 real 0m26.743s 00:18:11.403 user 0m50.232s 00:18:11.403 sys 0m3.832s 00:18:11.403 10:13:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:11.403 10:13:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:11.403 ************************************ 00:18:11.403 END TEST raid_state_function_test_sb 00:18:11.403 ************************************ 00:18:11.403 10:13:33 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:18:11.403 10:13:33 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:18:11.403 10:13:33 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:11.403 10:13:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:11.403 ************************************ 00:18:11.403 START TEST raid_superblock_test 00:18:11.403 ************************************ 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 4 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # 
strip_size_create_arg='-z 64' 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1047437 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1047437 /var/tmp/spdk-raid.sock 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1047437 ']' 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:11.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:11.403 10:13:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:11.404 10:13:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.665 [2024-06-10 10:13:33.293117] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:18:11.665 [2024-06-10 10:13:33.293164] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1047437 ] 00:18:11.665 [2024-06-10 10:13:33.381622] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:11.665 [2024-06-10 10:13:33.445307] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:11.665 [2024-06-10 10:13:33.489539] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:11.665 [2024-06-10 10:13:33.489565] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:12.607 malloc1 00:18:12.607 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:12.868 [2024-06-10 10:13:34.491941] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:12.868 [2024-06-10 10:13:34.491974] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:12.868 [2024-06-10 10:13:34.491986] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc13990 00:18:12.868 [2024-06-10 10:13:34.491993] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:12.868 [2024-06-10 10:13:34.493386] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:12.868 [2024-06-10 10:13:34.493405] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:12.868 pt1 00:18:12.868 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:12.868 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:12.868 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:12.868 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:12.868 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:12.868 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:12.868 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:12.868 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:12.868 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:12.868 malloc2 00:18:12.868 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:13.128 [2024-06-10 10:13:34.890901] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:13.128 [2024-06-10 10:13:34.890929] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:13.128 [2024-06-10 10:13:34.890939] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc144e0 00:18:13.128 [2024-06-10 10:13:34.890945] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:13.128 [2024-06-10 10:13:34.892124] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:13.128 [2024-06-10 10:13:34.892143] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:13.128 pt2 00:18:13.128 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:13.128 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:13.128 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:13.128 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:13.128 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:13.128 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:13.128 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:13.128 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:13.128 10:13:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:13.429 malloc3 00:18:13.429 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:13.429 [2024-06-10 10:13:35.257545] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:13.429 [2024-06-10 10:13:35.257574] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:13.429 [2024-06-10 10:13:35.257583] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc04e0 00:18:13.429 [2024-06-10 10:13:35.257589] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:13.429 [2024-06-10 10:13:35.258766] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:13.429 [2024-06-10 10:13:35.258784] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:13.429 pt3 00:18:13.429 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:13.429 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:13.429 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:13.429 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:13.429 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:13.429 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:13.429 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:13.429 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:13.429 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:13.690 malloc4 00:18:13.690 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:13.951 [2024-06-10 10:13:35.640419] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:13.951 [2024-06-10 10:13:35.640446] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:13.951 [2024-06-10 10:13:35.640455] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc2f50 00:18:13.951 [2024-06-10 10:13:35.640461] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:13.951 [2024-06-10 10:13:35.641627] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:13.951 [2024-06-10 10:13:35.641644] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:13.951 pt4 00:18:13.951 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:13.951 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:13.951 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:14.212 [2024-06-10 10:13:35.828902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:14.212 [2024-06-10 10:13:35.829893] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:14.213 [2024-06-10 10:13:35.829934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:14.213 [2024-06-10 10:13:35.829967] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:14.213 [2024-06-10 10:13:35.830100] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdc4fb0 00:18:14.213 [2024-06-10 10:13:35.830107] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:14.213 [2024-06-10 10:13:35.830255] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc13590 00:18:14.213 [2024-06-10 10:13:35.830363] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdc4fb0 00:18:14.213 [2024-06-10 10:13:35.830368] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdc4fb0 00:18:14.213 [2024-06-10 10:13:35.830436] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.213 10:13:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:14.213 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:14.213 "name": "raid_bdev1", 00:18:14.213 "uuid": "fd276be5-8ee8-4120-b3c3-c5872208e517", 00:18:14.213 "strip_size_kb": 64, 00:18:14.213 
"state": "online", 00:18:14.213 "raid_level": "concat", 00:18:14.213 "superblock": true, 00:18:14.213 "num_base_bdevs": 4, 00:18:14.213 "num_base_bdevs_discovered": 4, 00:18:14.213 "num_base_bdevs_operational": 4, 00:18:14.213 "base_bdevs_list": [ 00:18:14.213 { 00:18:14.213 "name": "pt1", 00:18:14.213 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:14.213 "is_configured": true, 00:18:14.213 "data_offset": 2048, 00:18:14.213 "data_size": 63488 00:18:14.213 }, 00:18:14.213 { 00:18:14.213 "name": "pt2", 00:18:14.213 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:14.213 "is_configured": true, 00:18:14.213 "data_offset": 2048, 00:18:14.213 "data_size": 63488 00:18:14.213 }, 00:18:14.213 { 00:18:14.213 "name": "pt3", 00:18:14.213 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:14.213 "is_configured": true, 00:18:14.213 "data_offset": 2048, 00:18:14.213 "data_size": 63488 00:18:14.213 }, 00:18:14.213 { 00:18:14.213 "name": "pt4", 00:18:14.213 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:14.213 "is_configured": true, 00:18:14.213 "data_offset": 2048, 00:18:14.213 "data_size": 63488 00:18:14.213 } 00:18:14.213 ] 00:18:14.213 }' 00:18:14.213 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:14.213 10:13:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:14.784 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:14.784 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:14.784 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:14.784 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:14.784 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:14.784 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:14.784 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:14.784 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:15.045 [2024-06-10 10:13:36.755443] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:15.045 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:15.045 "name": "raid_bdev1", 00:18:15.045 "aliases": [ 00:18:15.045 "fd276be5-8ee8-4120-b3c3-c5872208e517" 00:18:15.045 ], 00:18:15.045 "product_name": "Raid Volume", 00:18:15.045 "block_size": 512, 00:18:15.045 "num_blocks": 253952, 00:18:15.045 "uuid": "fd276be5-8ee8-4120-b3c3-c5872208e517", 00:18:15.045 "assigned_rate_limits": { 00:18:15.045 "rw_ios_per_sec": 0, 00:18:15.045 "rw_mbytes_per_sec": 0, 00:18:15.045 "r_mbytes_per_sec": 0, 00:18:15.045 "w_mbytes_per_sec": 0 00:18:15.045 }, 00:18:15.045 "claimed": false, 00:18:15.045 "zoned": false, 00:18:15.045 "supported_io_types": { 00:18:15.045 "read": true, 00:18:15.045 "write": true, 00:18:15.045 "unmap": true, 00:18:15.045 "write_zeroes": true, 00:18:15.045 "flush": true, 00:18:15.045 "reset": true, 00:18:15.045 "compare": false, 00:18:15.045 "compare_and_write": false, 00:18:15.045 "abort": false, 00:18:15.045 "nvme_admin": false, 00:18:15.045 "nvme_io": false 00:18:15.045 }, 00:18:15.045 "memory_domains": [ 00:18:15.045 { 00:18:15.045 "dma_device_id": 
"system", 00:18:15.045 "dma_device_type": 1 00:18:15.045 }, 00:18:15.045 { 00:18:15.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.045 "dma_device_type": 2 00:18:15.045 }, 00:18:15.045 { 00:18:15.045 "dma_device_id": "system", 00:18:15.045 "dma_device_type": 1 00:18:15.045 }, 00:18:15.045 { 00:18:15.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.045 "dma_device_type": 2 00:18:15.045 }, 00:18:15.045 { 00:18:15.045 "dma_device_id": "system", 00:18:15.045 "dma_device_type": 1 00:18:15.045 }, 00:18:15.045 { 00:18:15.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.045 "dma_device_type": 2 00:18:15.045 }, 00:18:15.045 { 00:18:15.045 "dma_device_id": "system", 00:18:15.045 "dma_device_type": 1 00:18:15.045 }, 00:18:15.045 { 00:18:15.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.045 "dma_device_type": 2 00:18:15.045 } 00:18:15.045 ], 00:18:15.045 "driver_specific": { 00:18:15.045 "raid": { 00:18:15.045 "uuid": "fd276be5-8ee8-4120-b3c3-c5872208e517", 00:18:15.045 "strip_size_kb": 64, 00:18:15.045 "state": "online", 00:18:15.045 "raid_level": "concat", 00:18:15.045 "superblock": true, 00:18:15.045 "num_base_bdevs": 4, 00:18:15.045 "num_base_bdevs_discovered": 4, 00:18:15.045 "num_base_bdevs_operational": 4, 00:18:15.045 "base_bdevs_list": [ 00:18:15.045 { 00:18:15.045 "name": "pt1", 00:18:15.045 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:15.045 "is_configured": true, 00:18:15.045 "data_offset": 2048, 00:18:15.045 "data_size": 63488 00:18:15.045 }, 00:18:15.045 { 00:18:15.045 "name": "pt2", 00:18:15.045 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:15.045 "is_configured": true, 00:18:15.045 "data_offset": 2048, 00:18:15.045 "data_size": 63488 00:18:15.045 }, 00:18:15.045 { 00:18:15.045 "name": "pt3", 00:18:15.045 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:15.045 "is_configured": true, 00:18:15.045 "data_offset": 2048, 00:18:15.045 "data_size": 63488 00:18:15.045 }, 00:18:15.045 { 00:18:15.045 "name": "pt4", 00:18:15.045 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:15.045 "is_configured": true, 00:18:15.045 "data_offset": 2048, 00:18:15.045 "data_size": 63488 00:18:15.045 } 00:18:15.045 ] 00:18:15.045 } 00:18:15.045 } 00:18:15.045 }' 00:18:15.045 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:15.045 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:15.045 pt2 00:18:15.045 pt3 00:18:15.045 pt4' 00:18:15.045 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:15.045 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:15.045 10:13:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:15.306 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:15.306 "name": "pt1", 00:18:15.306 "aliases": [ 00:18:15.306 "00000000-0000-0000-0000-000000000001" 00:18:15.306 ], 00:18:15.306 "product_name": "passthru", 00:18:15.306 "block_size": 512, 00:18:15.306 "num_blocks": 65536, 00:18:15.306 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:15.306 "assigned_rate_limits": { 00:18:15.306 "rw_ios_per_sec": 0, 00:18:15.306 "rw_mbytes_per_sec": 0, 00:18:15.306 "r_mbytes_per_sec": 0, 00:18:15.306 "w_mbytes_per_sec": 0 00:18:15.306 }, 
00:18:15.306 "claimed": true, 00:18:15.306 "claim_type": "exclusive_write", 00:18:15.306 "zoned": false, 00:18:15.306 "supported_io_types": { 00:18:15.306 "read": true, 00:18:15.306 "write": true, 00:18:15.306 "unmap": true, 00:18:15.306 "write_zeroes": true, 00:18:15.306 "flush": true, 00:18:15.306 "reset": true, 00:18:15.306 "compare": false, 00:18:15.306 "compare_and_write": false, 00:18:15.306 "abort": true, 00:18:15.306 "nvme_admin": false, 00:18:15.306 "nvme_io": false 00:18:15.306 }, 00:18:15.306 "memory_domains": [ 00:18:15.306 { 00:18:15.306 "dma_device_id": "system", 00:18:15.306 "dma_device_type": 1 00:18:15.306 }, 00:18:15.306 { 00:18:15.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.306 "dma_device_type": 2 00:18:15.306 } 00:18:15.306 ], 00:18:15.306 "driver_specific": { 00:18:15.306 "passthru": { 00:18:15.306 "name": "pt1", 00:18:15.306 "base_bdev_name": "malloc1" 00:18:15.306 } 00:18:15.306 } 00:18:15.306 }' 00:18:15.306 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.306 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.306 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:15.306 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.306 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.567 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:15.567 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.567 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.567 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:15.567 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.567 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.567 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:15.567 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:15.567 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:15.567 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:15.828 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:15.828 "name": "pt2", 00:18:15.828 "aliases": [ 00:18:15.828 "00000000-0000-0000-0000-000000000002" 00:18:15.828 ], 00:18:15.828 "product_name": "passthru", 00:18:15.828 "block_size": 512, 00:18:15.828 "num_blocks": 65536, 00:18:15.828 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:15.828 "assigned_rate_limits": { 00:18:15.828 "rw_ios_per_sec": 0, 00:18:15.828 "rw_mbytes_per_sec": 0, 00:18:15.828 "r_mbytes_per_sec": 0, 00:18:15.828 "w_mbytes_per_sec": 0 00:18:15.828 }, 00:18:15.828 "claimed": true, 00:18:15.828 "claim_type": "exclusive_write", 00:18:15.828 "zoned": false, 00:18:15.828 "supported_io_types": { 00:18:15.828 "read": true, 00:18:15.828 "write": true, 00:18:15.828 "unmap": true, 00:18:15.828 "write_zeroes": true, 00:18:15.828 "flush": true, 00:18:15.828 "reset": true, 00:18:15.828 "compare": false, 00:18:15.828 "compare_and_write": false, 00:18:15.828 "abort": true, 00:18:15.828 
"nvme_admin": false, 00:18:15.828 "nvme_io": false 00:18:15.828 }, 00:18:15.828 "memory_domains": [ 00:18:15.828 { 00:18:15.828 "dma_device_id": "system", 00:18:15.828 "dma_device_type": 1 00:18:15.828 }, 00:18:15.828 { 00:18:15.828 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.828 "dma_device_type": 2 00:18:15.828 } 00:18:15.828 ], 00:18:15.828 "driver_specific": { 00:18:15.828 "passthru": { 00:18:15.828 "name": "pt2", 00:18:15.828 "base_bdev_name": "malloc2" 00:18:15.828 } 00:18:15.828 } 00:18:15.828 }' 00:18:15.828 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.828 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.828 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:15.828 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.828 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.089 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:16.089 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.089 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.089 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:16.089 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.089 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.089 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:16.089 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:16.089 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:16.089 10:13:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:16.351 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:16.351 "name": "pt3", 00:18:16.351 "aliases": [ 00:18:16.351 "00000000-0000-0000-0000-000000000003" 00:18:16.351 ], 00:18:16.351 "product_name": "passthru", 00:18:16.351 "block_size": 512, 00:18:16.351 "num_blocks": 65536, 00:18:16.351 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:16.351 "assigned_rate_limits": { 00:18:16.351 "rw_ios_per_sec": 0, 00:18:16.351 "rw_mbytes_per_sec": 0, 00:18:16.351 "r_mbytes_per_sec": 0, 00:18:16.351 "w_mbytes_per_sec": 0 00:18:16.351 }, 00:18:16.351 "claimed": true, 00:18:16.351 "claim_type": "exclusive_write", 00:18:16.351 "zoned": false, 00:18:16.351 "supported_io_types": { 00:18:16.351 "read": true, 00:18:16.351 "write": true, 00:18:16.351 "unmap": true, 00:18:16.351 "write_zeroes": true, 00:18:16.351 "flush": true, 00:18:16.351 "reset": true, 00:18:16.351 "compare": false, 00:18:16.351 "compare_and_write": false, 00:18:16.351 "abort": true, 00:18:16.351 "nvme_admin": false, 00:18:16.351 "nvme_io": false 00:18:16.351 }, 00:18:16.351 "memory_domains": [ 00:18:16.351 { 00:18:16.351 "dma_device_id": "system", 00:18:16.351 "dma_device_type": 1 00:18:16.351 }, 00:18:16.351 { 00:18:16.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.351 "dma_device_type": 2 00:18:16.351 } 00:18:16.351 ], 00:18:16.351 "driver_specific": { 00:18:16.351 "passthru": { 00:18:16.351 "name": "pt3", 
00:18:16.351 "base_bdev_name": "malloc3" 00:18:16.351 } 00:18:16.351 } 00:18:16.351 }' 00:18:16.351 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.351 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.351 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:16.351 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.351 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.611 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:16.611 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.611 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.611 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:16.611 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.611 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.611 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:16.611 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:16.611 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:16.611 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:16.871 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:16.871 "name": "pt4", 00:18:16.871 "aliases": [ 00:18:16.871 "00000000-0000-0000-0000-000000000004" 00:18:16.871 ], 00:18:16.871 "product_name": "passthru", 00:18:16.871 "block_size": 512, 00:18:16.871 "num_blocks": 65536, 00:18:16.871 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:16.871 "assigned_rate_limits": { 00:18:16.871 "rw_ios_per_sec": 0, 00:18:16.871 "rw_mbytes_per_sec": 0, 00:18:16.871 "r_mbytes_per_sec": 0, 00:18:16.871 "w_mbytes_per_sec": 0 00:18:16.871 }, 00:18:16.871 "claimed": true, 00:18:16.871 "claim_type": "exclusive_write", 00:18:16.871 "zoned": false, 00:18:16.871 "supported_io_types": { 00:18:16.871 "read": true, 00:18:16.871 "write": true, 00:18:16.871 "unmap": true, 00:18:16.871 "write_zeroes": true, 00:18:16.871 "flush": true, 00:18:16.871 "reset": true, 00:18:16.871 "compare": false, 00:18:16.871 "compare_and_write": false, 00:18:16.871 "abort": true, 00:18:16.871 "nvme_admin": false, 00:18:16.871 "nvme_io": false 00:18:16.871 }, 00:18:16.871 "memory_domains": [ 00:18:16.871 { 00:18:16.871 "dma_device_id": "system", 00:18:16.871 "dma_device_type": 1 00:18:16.871 }, 00:18:16.871 { 00:18:16.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.871 "dma_device_type": 2 00:18:16.871 } 00:18:16.871 ], 00:18:16.871 "driver_specific": { 00:18:16.871 "passthru": { 00:18:16.871 "name": "pt4", 00:18:16.871 "base_bdev_name": "malloc4" 00:18:16.871 } 00:18:16.871 } 00:18:16.871 }' 00:18:16.871 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.871 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.871 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:16.871 10:13:38 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.132 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.132 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:17.132 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.132 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.132 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:17.132 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.132 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.132 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:17.132 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:17.132 10:13:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:17.391 [2024-06-10 10:13:39.129470] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:17.391 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=fd276be5-8ee8-4120-b3c3-c5872208e517 00:18:17.391 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z fd276be5-8ee8-4120-b3c3-c5872208e517 ']' 00:18:17.391 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:17.651 [2024-06-10 10:13:39.321736] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:17.651 [2024-06-10 10:13:39.321748] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:17.651 [2024-06-10 10:13:39.321783] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:17.651 [2024-06-10 10:13:39.321839] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:17.651 [2024-06-10 10:13:39.321846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdc4fb0 name raid_bdev1, state offline 00:18:17.651 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.651 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:17.912 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:17.912 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:17.912 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:17.912 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:17.912 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:17.912 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:18.172 10:13:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:18.172 10:13:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:18.433 10:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:18.433 10:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:18.433 10:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:18.433 10:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:18.694 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:18.955 [2024-06-10 10:13:40.653060] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:18.955 [2024-06-10 10:13:40.654124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:18.955 [2024-06-10 10:13:40.654156] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:18.955 [2024-06-10 10:13:40.654182] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:18.955 [2024-06-10 10:13:40.654217] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:18.955 [2024-06-10 10:13:40.654243] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:18.955 [2024-06-10 10:13:40.654257] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:18.955 [2024-06-10 10:13:40.654270] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:18.955 [2024-06-10 10:13:40.654280] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:18.955 [2024-06-10 10:13:40.654286] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc12a10 name raid_bdev1, state configuring 00:18:18.955 request: 00:18:18.955 { 00:18:18.955 "name": "raid_bdev1", 00:18:18.955 "raid_level": "concat", 00:18:18.955 "base_bdevs": [ 00:18:18.955 "malloc1", 00:18:18.955 "malloc2", 00:18:18.955 "malloc3", 00:18:18.955 "malloc4" 00:18:18.955 ], 00:18:18.955 "superblock": false, 00:18:18.955 "strip_size_kb": 64, 00:18:18.955 "method": "bdev_raid_create", 00:18:18.955 "req_id": 1 00:18:18.955 } 00:18:18.956 Got JSON-RPC error response 00:18:18.956 response: 00:18:18.956 { 00:18:18.956 "code": -17, 00:18:18.956 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:18.956 } 00:18:18.956 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:18:18.956 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:18:18.956 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:18:18.956 10:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:18:18.956 10:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.956 10:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:19.281 10:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:19.281 10:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:19.281 10:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:19.281 [2024-06-10 10:13:41.038129] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:19.281 [2024-06-10 10:13:41.038151] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:19.281 [2024-06-10 10:13:41.038161] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc5250 00:18:19.281 [2024-06-10 10:13:41.038167] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:19.281 [2024-06-10 10:13:41.039409] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:19.281 [2024-06-10 10:13:41.039429] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:19.281 [2024-06-10 10:13:41.039473] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:19.281 [2024-06-10 10:13:41.039490] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:19.281 pt1 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.281 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:19.541 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:19.541 "name": "raid_bdev1", 00:18:19.541 "uuid": "fd276be5-8ee8-4120-b3c3-c5872208e517", 00:18:19.541 "strip_size_kb": 64, 00:18:19.541 "state": "configuring", 00:18:19.541 "raid_level": "concat", 00:18:19.541 "superblock": true, 00:18:19.541 "num_base_bdevs": 4, 00:18:19.541 "num_base_bdevs_discovered": 1, 00:18:19.541 "num_base_bdevs_operational": 4, 00:18:19.542 "base_bdevs_list": [ 00:18:19.542 { 00:18:19.542 "name": "pt1", 00:18:19.542 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:19.542 "is_configured": true, 00:18:19.542 "data_offset": 2048, 00:18:19.542 "data_size": 63488 00:18:19.542 }, 00:18:19.542 { 00:18:19.542 "name": null, 00:18:19.542 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:19.542 "is_configured": false, 00:18:19.542 "data_offset": 2048, 00:18:19.542 "data_size": 63488 00:18:19.542 }, 00:18:19.542 { 00:18:19.542 "name": null, 00:18:19.542 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:19.542 "is_configured": false, 00:18:19.542 "data_offset": 2048, 00:18:19.542 "data_size": 63488 00:18:19.542 }, 00:18:19.542 { 00:18:19.542 "name": null, 00:18:19.542 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:19.542 "is_configured": false, 00:18:19.542 "data_offset": 2048, 00:18:19.542 "data_size": 63488 00:18:19.542 } 00:18:19.542 ] 00:18:19.542 }' 00:18:19.542 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:19.542 10:13:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:20.111 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:18:20.111 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:20.111 [2024-06-10 10:13:41.936413] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:20.111 [2024-06-10 10:13:41.936442] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:20.111 [2024-06-10 10:13:41.936452] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdb2f80 00:18:20.111 [2024-06-10 10:13:41.936458] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:20.111 [2024-06-10 10:13:41.936719] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:20.111 [2024-06-10 10:13:41.936729] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:20.111 [2024-06-10 10:13:41.936770] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:20.111 [2024-06-10 10:13:41.936781] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:20.111 pt2 00:18:20.111 10:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:20.371 [2024-06-10 10:13:42.124891] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.371 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:20.631 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.631 "name": "raid_bdev1", 00:18:20.631 "uuid": "fd276be5-8ee8-4120-b3c3-c5872208e517", 00:18:20.631 "strip_size_kb": 64, 00:18:20.631 "state": "configuring", 00:18:20.631 "raid_level": "concat", 00:18:20.631 "superblock": true, 00:18:20.631 "num_base_bdevs": 4, 00:18:20.631 "num_base_bdevs_discovered": 1, 00:18:20.631 "num_base_bdevs_operational": 4, 00:18:20.631 "base_bdevs_list": [ 00:18:20.631 { 00:18:20.631 "name": "pt1", 00:18:20.631 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:20.631 "is_configured": true, 00:18:20.631 "data_offset": 2048, 00:18:20.631 "data_size": 63488 00:18:20.631 }, 00:18:20.631 { 00:18:20.631 "name": 
null, 00:18:20.631 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:20.631 "is_configured": false, 00:18:20.631 "data_offset": 2048, 00:18:20.631 "data_size": 63488 00:18:20.631 }, 00:18:20.631 { 00:18:20.631 "name": null, 00:18:20.631 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:20.631 "is_configured": false, 00:18:20.631 "data_offset": 2048, 00:18:20.631 "data_size": 63488 00:18:20.631 }, 00:18:20.631 { 00:18:20.631 "name": null, 00:18:20.631 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:20.631 "is_configured": false, 00:18:20.631 "data_offset": 2048, 00:18:20.631 "data_size": 63488 00:18:20.631 } 00:18:20.631 ] 00:18:20.631 }' 00:18:20.631 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.631 10:13:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:21.201 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:21.201 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:21.201 10:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:21.201 [2024-06-10 10:13:43.043232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:21.201 [2024-06-10 10:13:43.043260] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:21.201 [2024-06-10 10:13:43.043268] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc4d10 00:18:21.201 [2024-06-10 10:13:43.043275] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:21.201 [2024-06-10 10:13:43.043530] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:21.201 [2024-06-10 10:13:43.043540] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:21.201 [2024-06-10 10:13:43.043581] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:21.201 [2024-06-10 10:13:43.043592] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:21.201 pt2 00:18:21.201 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:21.201 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:21.201 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:21.462 [2024-06-10 10:13:43.235725] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:21.462 [2024-06-10 10:13:43.235745] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:21.462 [2024-06-10 10:13:43.235753] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdbf4d0 00:18:21.462 [2024-06-10 10:13:43.235759] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:21.462 [2024-06-10 10:13:43.235978] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:21.462 [2024-06-10 10:13:43.235988] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:21.462 [2024-06-10 10:13:43.236020] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: 
raid superblock found on bdev pt3 00:18:21.462 [2024-06-10 10:13:43.236030] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:21.462 pt3 00:18:21.462 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:21.462 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:21.462 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:21.723 [2024-06-10 10:13:43.416181] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:21.723 [2024-06-10 10:13:43.416200] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:21.723 [2024-06-10 10:13:43.416209] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc0a20 00:18:21.723 [2024-06-10 10:13:43.416215] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:21.723 [2024-06-10 10:13:43.416416] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:21.723 [2024-06-10 10:13:43.416426] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:21.723 [2024-06-10 10:13:43.416456] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:21.723 [2024-06-10 10:13:43.416466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:21.723 [2024-06-10 10:13:43.416554] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdbd750 00:18:21.723 [2024-06-10 10:13:43.416560] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:21.723 [2024-06-10 10:13:43.416688] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdb5cc0 00:18:21.723 [2024-06-10 10:13:43.416785] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdbd750 00:18:21.723 [2024-06-10 10:13:43.416790] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdbd750 00:18:21.723 [2024-06-10 10:13:43.416866] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:21.723 pt4 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.723 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:21.985 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:21.985 "name": "raid_bdev1", 00:18:21.985 "uuid": "fd276be5-8ee8-4120-b3c3-c5872208e517", 00:18:21.985 "strip_size_kb": 64, 00:18:21.985 "state": "online", 00:18:21.985 "raid_level": "concat", 00:18:21.985 "superblock": true, 00:18:21.985 "num_base_bdevs": 4, 00:18:21.985 "num_base_bdevs_discovered": 4, 00:18:21.985 "num_base_bdevs_operational": 4, 00:18:21.985 "base_bdevs_list": [ 00:18:21.985 { 00:18:21.985 "name": "pt1", 00:18:21.985 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:21.985 "is_configured": true, 00:18:21.985 "data_offset": 2048, 00:18:21.985 "data_size": 63488 00:18:21.985 }, 00:18:21.985 { 00:18:21.985 "name": "pt2", 00:18:21.985 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:21.985 "is_configured": true, 00:18:21.985 "data_offset": 2048, 00:18:21.985 "data_size": 63488 00:18:21.985 }, 00:18:21.985 { 00:18:21.985 "name": "pt3", 00:18:21.985 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:21.985 "is_configured": true, 00:18:21.985 "data_offset": 2048, 00:18:21.985 "data_size": 63488 00:18:21.985 }, 00:18:21.985 { 00:18:21.985 "name": "pt4", 00:18:21.985 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:21.985 "is_configured": true, 00:18:21.985 "data_offset": 2048, 00:18:21.985 "data_size": 63488 00:18:21.985 } 00:18:21.985 ] 00:18:21.985 }' 00:18:21.985 10:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:21.985 10:13:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:22.557 [2024-06-10 10:13:44.338818] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:22.557 "name": "raid_bdev1", 00:18:22.557 "aliases": [ 00:18:22.557 "fd276be5-8ee8-4120-b3c3-c5872208e517" 00:18:22.557 ], 00:18:22.557 "product_name": "Raid Volume", 00:18:22.557 "block_size": 512, 00:18:22.557 "num_blocks": 253952, 00:18:22.557 "uuid": "fd276be5-8ee8-4120-b3c3-c5872208e517", 00:18:22.557 "assigned_rate_limits": { 
00:18:22.557 "rw_ios_per_sec": 0, 00:18:22.557 "rw_mbytes_per_sec": 0, 00:18:22.557 "r_mbytes_per_sec": 0, 00:18:22.557 "w_mbytes_per_sec": 0 00:18:22.557 }, 00:18:22.557 "claimed": false, 00:18:22.557 "zoned": false, 00:18:22.557 "supported_io_types": { 00:18:22.557 "read": true, 00:18:22.557 "write": true, 00:18:22.557 "unmap": true, 00:18:22.557 "write_zeroes": true, 00:18:22.557 "flush": true, 00:18:22.557 "reset": true, 00:18:22.557 "compare": false, 00:18:22.557 "compare_and_write": false, 00:18:22.557 "abort": false, 00:18:22.557 "nvme_admin": false, 00:18:22.557 "nvme_io": false 00:18:22.557 }, 00:18:22.557 "memory_domains": [ 00:18:22.557 { 00:18:22.557 "dma_device_id": "system", 00:18:22.557 "dma_device_type": 1 00:18:22.557 }, 00:18:22.557 { 00:18:22.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.557 "dma_device_type": 2 00:18:22.557 }, 00:18:22.557 { 00:18:22.557 "dma_device_id": "system", 00:18:22.557 "dma_device_type": 1 00:18:22.557 }, 00:18:22.557 { 00:18:22.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.557 "dma_device_type": 2 00:18:22.557 }, 00:18:22.557 { 00:18:22.557 "dma_device_id": "system", 00:18:22.557 "dma_device_type": 1 00:18:22.557 }, 00:18:22.557 { 00:18:22.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.557 "dma_device_type": 2 00:18:22.557 }, 00:18:22.557 { 00:18:22.557 "dma_device_id": "system", 00:18:22.557 "dma_device_type": 1 00:18:22.557 }, 00:18:22.557 { 00:18:22.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.557 "dma_device_type": 2 00:18:22.557 } 00:18:22.557 ], 00:18:22.557 "driver_specific": { 00:18:22.557 "raid": { 00:18:22.557 "uuid": "fd276be5-8ee8-4120-b3c3-c5872208e517", 00:18:22.557 "strip_size_kb": 64, 00:18:22.557 "state": "online", 00:18:22.557 "raid_level": "concat", 00:18:22.557 "superblock": true, 00:18:22.557 "num_base_bdevs": 4, 00:18:22.557 "num_base_bdevs_discovered": 4, 00:18:22.557 "num_base_bdevs_operational": 4, 00:18:22.557 "base_bdevs_list": [ 00:18:22.557 { 00:18:22.557 "name": "pt1", 00:18:22.557 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:22.557 "is_configured": true, 00:18:22.557 "data_offset": 2048, 00:18:22.557 "data_size": 63488 00:18:22.557 }, 00:18:22.557 { 00:18:22.557 "name": "pt2", 00:18:22.557 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:22.557 "is_configured": true, 00:18:22.557 "data_offset": 2048, 00:18:22.557 "data_size": 63488 00:18:22.557 }, 00:18:22.557 { 00:18:22.557 "name": "pt3", 00:18:22.557 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:22.557 "is_configured": true, 00:18:22.557 "data_offset": 2048, 00:18:22.557 "data_size": 63488 00:18:22.557 }, 00:18:22.557 { 00:18:22.557 "name": "pt4", 00:18:22.557 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:22.557 "is_configured": true, 00:18:22.557 "data_offset": 2048, 00:18:22.557 "data_size": 63488 00:18:22.557 } 00:18:22.557 ] 00:18:22.557 } 00:18:22.557 } 00:18:22.557 }' 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:22.557 pt2 00:18:22.557 pt3 00:18:22.557 pt4' 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 
00:18:22.557 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:22.818 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:22.818 "name": "pt1", 00:18:22.818 "aliases": [ 00:18:22.818 "00000000-0000-0000-0000-000000000001" 00:18:22.818 ], 00:18:22.818 "product_name": "passthru", 00:18:22.818 "block_size": 512, 00:18:22.818 "num_blocks": 65536, 00:18:22.819 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:22.819 "assigned_rate_limits": { 00:18:22.819 "rw_ios_per_sec": 0, 00:18:22.819 "rw_mbytes_per_sec": 0, 00:18:22.819 "r_mbytes_per_sec": 0, 00:18:22.819 "w_mbytes_per_sec": 0 00:18:22.819 }, 00:18:22.819 "claimed": true, 00:18:22.819 "claim_type": "exclusive_write", 00:18:22.819 "zoned": false, 00:18:22.819 "supported_io_types": { 00:18:22.819 "read": true, 00:18:22.819 "write": true, 00:18:22.819 "unmap": true, 00:18:22.819 "write_zeroes": true, 00:18:22.819 "flush": true, 00:18:22.819 "reset": true, 00:18:22.819 "compare": false, 00:18:22.819 "compare_and_write": false, 00:18:22.819 "abort": true, 00:18:22.819 "nvme_admin": false, 00:18:22.819 "nvme_io": false 00:18:22.819 }, 00:18:22.819 "memory_domains": [ 00:18:22.819 { 00:18:22.819 "dma_device_id": "system", 00:18:22.819 "dma_device_type": 1 00:18:22.819 }, 00:18:22.819 { 00:18:22.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.819 "dma_device_type": 2 00:18:22.819 } 00:18:22.819 ], 00:18:22.819 "driver_specific": { 00:18:22.819 "passthru": { 00:18:22.819 "name": "pt1", 00:18:22.819 "base_bdev_name": "malloc1" 00:18:22.819 } 00:18:22.819 } 00:18:22.819 }' 00:18:22.819 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:22.819 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:22.819 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:22.819 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.079 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.079 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:23.079 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.079 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.079 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:23.079 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.079 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.079 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:23.079 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:23.079 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:23.079 10:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:23.340 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:23.340 "name": "pt2", 00:18:23.340 "aliases": [ 00:18:23.340 "00000000-0000-0000-0000-000000000002" 00:18:23.340 ], 00:18:23.340 "product_name": "passthru", 00:18:23.340 "block_size": 512, 00:18:23.340 "num_blocks": 65536, 00:18:23.340 
"uuid": "00000000-0000-0000-0000-000000000002", 00:18:23.340 "assigned_rate_limits": { 00:18:23.340 "rw_ios_per_sec": 0, 00:18:23.340 "rw_mbytes_per_sec": 0, 00:18:23.340 "r_mbytes_per_sec": 0, 00:18:23.340 "w_mbytes_per_sec": 0 00:18:23.340 }, 00:18:23.340 "claimed": true, 00:18:23.340 "claim_type": "exclusive_write", 00:18:23.340 "zoned": false, 00:18:23.340 "supported_io_types": { 00:18:23.340 "read": true, 00:18:23.340 "write": true, 00:18:23.340 "unmap": true, 00:18:23.340 "write_zeroes": true, 00:18:23.340 "flush": true, 00:18:23.340 "reset": true, 00:18:23.340 "compare": false, 00:18:23.340 "compare_and_write": false, 00:18:23.340 "abort": true, 00:18:23.340 "nvme_admin": false, 00:18:23.340 "nvme_io": false 00:18:23.340 }, 00:18:23.340 "memory_domains": [ 00:18:23.340 { 00:18:23.340 "dma_device_id": "system", 00:18:23.340 "dma_device_type": 1 00:18:23.340 }, 00:18:23.340 { 00:18:23.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.340 "dma_device_type": 2 00:18:23.340 } 00:18:23.340 ], 00:18:23.340 "driver_specific": { 00:18:23.340 "passthru": { 00:18:23.340 "name": "pt2", 00:18:23.340 "base_bdev_name": "malloc2" 00:18:23.340 } 00:18:23.340 } 00:18:23.340 }' 00:18:23.340 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.340 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:23.601 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:23.862 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:23.862 "name": "pt3", 00:18:23.862 "aliases": [ 00:18:23.862 "00000000-0000-0000-0000-000000000003" 00:18:23.862 ], 00:18:23.862 "product_name": "passthru", 00:18:23.862 "block_size": 512, 00:18:23.862 "num_blocks": 65536, 00:18:23.862 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:23.862 "assigned_rate_limits": { 00:18:23.862 "rw_ios_per_sec": 0, 00:18:23.862 "rw_mbytes_per_sec": 0, 00:18:23.862 "r_mbytes_per_sec": 0, 00:18:23.862 "w_mbytes_per_sec": 0 00:18:23.862 }, 00:18:23.862 "claimed": true, 00:18:23.862 "claim_type": "exclusive_write", 00:18:23.862 "zoned": false, 00:18:23.862 "supported_io_types": { 00:18:23.862 "read": true, 00:18:23.862 "write": 
true, 00:18:23.862 "unmap": true, 00:18:23.862 "write_zeroes": true, 00:18:23.862 "flush": true, 00:18:23.862 "reset": true, 00:18:23.862 "compare": false, 00:18:23.862 "compare_and_write": false, 00:18:23.862 "abort": true, 00:18:23.862 "nvme_admin": false, 00:18:23.862 "nvme_io": false 00:18:23.862 }, 00:18:23.862 "memory_domains": [ 00:18:23.862 { 00:18:23.862 "dma_device_id": "system", 00:18:23.862 "dma_device_type": 1 00:18:23.862 }, 00:18:23.862 { 00:18:23.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.862 "dma_device_type": 2 00:18:23.862 } 00:18:23.862 ], 00:18:23.862 "driver_specific": { 00:18:23.862 "passthru": { 00:18:23.862 "name": "pt3", 00:18:23.862 "base_bdev_name": "malloc3" 00:18:23.862 } 00:18:23.862 } 00:18:23.862 }' 00:18:23.862 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.862 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.123 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:24.123 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.123 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.123 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:24.123 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.123 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.123 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:24.123 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:24.123 10:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:24.383 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:24.383 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:24.383 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:24.383 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:24.383 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:24.383 "name": "pt4", 00:18:24.383 "aliases": [ 00:18:24.383 "00000000-0000-0000-0000-000000000004" 00:18:24.383 ], 00:18:24.383 "product_name": "passthru", 00:18:24.383 "block_size": 512, 00:18:24.383 "num_blocks": 65536, 00:18:24.383 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:24.383 "assigned_rate_limits": { 00:18:24.383 "rw_ios_per_sec": 0, 00:18:24.383 "rw_mbytes_per_sec": 0, 00:18:24.383 "r_mbytes_per_sec": 0, 00:18:24.383 "w_mbytes_per_sec": 0 00:18:24.383 }, 00:18:24.383 "claimed": true, 00:18:24.383 "claim_type": "exclusive_write", 00:18:24.383 "zoned": false, 00:18:24.383 "supported_io_types": { 00:18:24.383 "read": true, 00:18:24.383 "write": true, 00:18:24.383 "unmap": true, 00:18:24.383 "write_zeroes": true, 00:18:24.383 "flush": true, 00:18:24.383 "reset": true, 00:18:24.383 "compare": false, 00:18:24.383 "compare_and_write": false, 00:18:24.383 "abort": true, 00:18:24.383 "nvme_admin": false, 00:18:24.383 "nvme_io": false 00:18:24.383 }, 00:18:24.383 "memory_domains": [ 00:18:24.383 { 00:18:24.383 "dma_device_id": "system", 00:18:24.383 "dma_device_type": 1 
00:18:24.383 }, 00:18:24.383 { 00:18:24.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.383 "dma_device_type": 2 00:18:24.383 } 00:18:24.383 ], 00:18:24.383 "driver_specific": { 00:18:24.383 "passthru": { 00:18:24.383 "name": "pt4", 00:18:24.383 "base_bdev_name": "malloc4" 00:18:24.383 } 00:18:24.383 } 00:18:24.383 }' 00:18:24.383 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.645 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.645 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:24.645 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.645 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.645 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:24.645 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.645 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.645 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:24.645 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:24.645 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:24.905 [2024-06-10 10:13:46.724886] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' fd276be5-8ee8-4120-b3c3-c5872208e517 '!=' fd276be5-8ee8-4120-b3c3-c5872208e517 ']' 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1047437 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1047437 ']' 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1047437 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:24.905 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1047437 00:18:25.166 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:25.166 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:25.166 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1047437' 00:18:25.167 killing process with pid 1047437 00:18:25.167 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # 
kill 1047437 00:18:25.167 [2024-06-10 10:13:46.779922] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:25.167 [2024-06-10 10:13:46.779964] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:25.167 [2024-06-10 10:13:46.780011] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:25.167 [2024-06-10 10:13:46.780017] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdbd750 name raid_bdev1, state offline 00:18:25.167 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1047437 00:18:25.167 [2024-06-10 10:13:46.800450] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:25.167 10:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:25.167 00:18:25.167 real 0m13.681s 00:18:25.167 user 0m25.220s 00:18:25.167 sys 0m1.987s 00:18:25.167 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:25.167 10:13:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.167 ************************************ 00:18:25.167 END TEST raid_superblock_test 00:18:25.167 ************************************ 00:18:25.167 10:13:46 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:18:25.167 10:13:46 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:18:25.167 10:13:46 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:25.167 10:13:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:25.167 ************************************ 00:18:25.167 START TEST raid_read_error_test 00:18:25.167 ************************************ 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 4 read 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:25.167 10:13:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:25.167 
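A condensed sketch of the control flow of this read-error test, using only commands and paths that appear verbatim in the trace: bdevperf is launched against the raid RPC socket and sits idle, the raid stack is then assembled through rpc.py, and only afterwards is the workload started over the same socket.

  # launch bdevperf on the raid socket; its results are captured in the bdevperf log created above (/raidtest/tmp.Tx7fLpKnHO)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
  # ...assemble the base bdevs and raid_bdev1 via rpc.py (traced below)...
  # then kick off the actual I/O:
  /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests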
10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Tx7fLpKnHO 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1050081 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1050081 /var/tmp/spdk-raid.sock 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1050081 ']' 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:25.167 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:25.167 10:13:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.427 [2024-06-10 10:13:47.065337] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
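The burst of RPCs that follows builds each base device as a three-layer stack so that failures can later be injected underneath the raid: a malloc bdev, an error bdev wrapped around it, and a passthru bdev that the raid consumes. Condensed from the trace for BaseBdev1 (the same three calls repeat for BaseBdev2 through BaseBdev4; script and socket paths as recorded below):

  # 32 MB backing store with 512-byte blocks (65536 blocks, matching the later JSON dumps)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  # error bdev on top of it; this becomes EE_BaseBdev1_malloc, the fault-injection point
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
  # passthru bdev that the raid will claim as BaseBdev1
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1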
00:18:25.427 [2024-06-10 10:13:47.065384] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1050081 ] 00:18:25.427 [2024-06-10 10:13:47.152013] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:25.427 [2024-06-10 10:13:47.218454] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:25.427 [2024-06-10 10:13:47.259883] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:25.427 [2024-06-10 10:13:47.259906] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:26.370 10:13:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:26.370 10:13:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:18:26.370 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:26.370 10:13:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:26.370 BaseBdev1_malloc 00:18:26.370 10:13:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:26.631 true 00:18:26.631 10:13:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:26.631 [2024-06-10 10:13:48.418487] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:26.631 [2024-06-10 10:13:48.418518] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:26.631 [2024-06-10 10:13:48.418528] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24d5d10 00:18:26.631 [2024-06-10 10:13:48.418534] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:26.631 [2024-06-10 10:13:48.419870] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:26.631 [2024-06-10 10:13:48.419888] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:26.631 BaseBdev1 00:18:26.631 10:13:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:26.631 10:13:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:26.891 BaseBdev2_malloc 00:18:26.891 10:13:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:27.152 true 00:18:27.152 10:13:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:27.152 [2024-06-10 10:13:48.985870] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:27.152 [2024-06-10 10:13:48.985900] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:18:27.152 [2024-06-10 10:13:48.985911] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24da710 00:18:27.152 [2024-06-10 10:13:48.985917] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:27.152 [2024-06-10 10:13:48.987099] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:27.152 [2024-06-10 10:13:48.987118] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:27.152 BaseBdev2 00:18:27.152 10:13:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:27.152 10:13:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:27.412 BaseBdev3_malloc 00:18:27.412 10:13:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:27.672 true 00:18:27.672 10:13:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:27.672 [2024-06-10 10:13:49.529132] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:27.672 [2024-06-10 10:13:49.529157] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:27.672 [2024-06-10 10:13:49.529166] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24db340 00:18:27.672 [2024-06-10 10:13:49.529172] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:27.672 [2024-06-10 10:13:49.530320] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:27.672 [2024-06-10 10:13:49.530338] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:27.672 BaseBdev3 00:18:27.932 10:13:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:27.932 10:13:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:27.932 BaseBdev4_malloc 00:18:27.932 10:13:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:28.192 true 00:18:28.192 10:13:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:28.451 [2024-06-10 10:13:50.088476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:28.451 [2024-06-10 10:13:50.088510] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:28.451 [2024-06-10 10:13:50.088524] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24d4aa0 00:18:28.451 [2024-06-10 10:13:50.088530] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:28.451 [2024-06-10 10:13:50.089780] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:28.451 [2024-06-10 
10:13:50.089800] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:28.451 BaseBdev4 00:18:28.451 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:28.451 [2024-06-10 10:13:50.280978] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:28.451 [2024-06-10 10:13:50.282027] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:28.451 [2024-06-10 10:13:50.282078] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:28.451 [2024-06-10 10:13:50.282126] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:28.451 [2024-06-10 10:13:50.282304] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24de1b0 00:18:28.452 [2024-06-10 10:13:50.282311] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:28.452 [2024-06-10 10:13:50.282459] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23250d0 00:18:28.452 [2024-06-10 10:13:50.282575] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24de1b0 00:18:28.452 [2024-06-10 10:13:50.282580] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24de1b0 00:18:28.452 [2024-06-10 10:13:50.282655] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.452 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:28.711 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:28.711 "name": "raid_bdev1", 00:18:28.711 "uuid": "bce168f6-6749-4bf9-b1b0-cc7e9e6586b6", 00:18:28.711 "strip_size_kb": 64, 00:18:28.711 "state": "online", 00:18:28.711 "raid_level": "concat", 00:18:28.711 "superblock": true, 00:18:28.711 "num_base_bdevs": 4, 00:18:28.711 "num_base_bdevs_discovered": 4, 00:18:28.711 "num_base_bdevs_operational": 4, 00:18:28.711 
"base_bdevs_list": [ 00:18:28.711 { 00:18:28.711 "name": "BaseBdev1", 00:18:28.711 "uuid": "7b442ff9-99ff-503b-8c26-0c6e90106bbb", 00:18:28.711 "is_configured": true, 00:18:28.711 "data_offset": 2048, 00:18:28.711 "data_size": 63488 00:18:28.711 }, 00:18:28.711 { 00:18:28.711 "name": "BaseBdev2", 00:18:28.711 "uuid": "e171325f-ecb4-551c-83a4-bc3e09a8816d", 00:18:28.711 "is_configured": true, 00:18:28.711 "data_offset": 2048, 00:18:28.711 "data_size": 63488 00:18:28.711 }, 00:18:28.711 { 00:18:28.711 "name": "BaseBdev3", 00:18:28.711 "uuid": "2eb736ee-c0cc-5ecc-b5a1-d83f9401c4bc", 00:18:28.711 "is_configured": true, 00:18:28.711 "data_offset": 2048, 00:18:28.711 "data_size": 63488 00:18:28.711 }, 00:18:28.711 { 00:18:28.711 "name": "BaseBdev4", 00:18:28.711 "uuid": "6105b185-cb2e-5006-b5f8-36aa7f81ea61", 00:18:28.711 "is_configured": true, 00:18:28.711 "data_offset": 2048, 00:18:28.711 "data_size": 63488 00:18:28.711 } 00:18:28.711 ] 00:18:28.711 }' 00:18:28.711 10:13:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:28.711 10:13:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:29.281 10:13:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:29.281 10:13:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:29.281 [2024-06-10 10:13:51.107232] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24d7060 00:18:30.221 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.480 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:18:30.740 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.740 "name": "raid_bdev1", 00:18:30.740 "uuid": "bce168f6-6749-4bf9-b1b0-cc7e9e6586b6", 00:18:30.740 "strip_size_kb": 64, 00:18:30.740 "state": "online", 00:18:30.740 "raid_level": "concat", 00:18:30.740 "superblock": true, 00:18:30.740 "num_base_bdevs": 4, 00:18:30.740 "num_base_bdevs_discovered": 4, 00:18:30.740 "num_base_bdevs_operational": 4, 00:18:30.740 "base_bdevs_list": [ 00:18:30.740 { 00:18:30.740 "name": "BaseBdev1", 00:18:30.740 "uuid": "7b442ff9-99ff-503b-8c26-0c6e90106bbb", 00:18:30.740 "is_configured": true, 00:18:30.740 "data_offset": 2048, 00:18:30.740 "data_size": 63488 00:18:30.740 }, 00:18:30.740 { 00:18:30.740 "name": "BaseBdev2", 00:18:30.740 "uuid": "e171325f-ecb4-551c-83a4-bc3e09a8816d", 00:18:30.740 "is_configured": true, 00:18:30.740 "data_offset": 2048, 00:18:30.740 "data_size": 63488 00:18:30.740 }, 00:18:30.740 { 00:18:30.740 "name": "BaseBdev3", 00:18:30.740 "uuid": "2eb736ee-c0cc-5ecc-b5a1-d83f9401c4bc", 00:18:30.740 "is_configured": true, 00:18:30.740 "data_offset": 2048, 00:18:30.740 "data_size": 63488 00:18:30.740 }, 00:18:30.740 { 00:18:30.740 "name": "BaseBdev4", 00:18:30.740 "uuid": "6105b185-cb2e-5006-b5f8-36aa7f81ea61", 00:18:30.740 "is_configured": true, 00:18:30.740 "data_offset": 2048, 00:18:30.740 "data_size": 63488 00:18:30.740 } 00:18:30.740 ] 00:18:30.740 }' 00:18:30.740 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.740 10:13:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.310 10:13:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:31.310 [2024-06-10 10:13:53.136077] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:31.310 [2024-06-10 10:13:53.136110] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:31.310 [2024-06-10 10:13:53.138875] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:31.310 [2024-06-10 10:13:53.138901] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:31.310 [2024-06-10 10:13:53.138930] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:31.310 [2024-06-10 10:13:53.138935] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24de1b0 name raid_bdev1, state offline 00:18:31.310 0 00:18:31.311 10:13:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1050081 00:18:31.311 10:13:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1050081 ']' 00:18:31.311 10:13:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1050081 00:18:31.311 10:13:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:18:31.311 10:13:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:31.311 10:13:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1050081 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:31.571 10:13:53 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1050081' 00:18:31.571 killing process with pid 1050081 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1050081 00:18:31.571 [2024-06-10 10:13:53.201140] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1050081 00:18:31.571 [2024-06-10 10:13:53.218204] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Tx7fLpKnHO 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:18:31.571 00:18:31.571 real 0m6.353s 00:18:31.571 user 0m10.249s 00:18:31.571 sys 0m0.850s 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:31.571 10:13:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.571 ************************************ 00:18:31.571 END TEST raid_read_error_test 00:18:31.571 ************************************ 00:18:31.571 10:13:53 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:18:31.571 10:13:53 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:18:31.571 10:13:53 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:31.571 10:13:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:31.571 ************************************ 00:18:31.571 START TEST raid_write_error_test 00:18:31.571 ************************************ 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 4 write 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:31.571 10:13:53 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:31.571 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:31.832 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.awk9eIZSBr 00:18:31.832 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1051227 00:18:31.832 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1051227 /var/tmp/spdk-raid.sock 00:18:31.832 10:13:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:31.832 10:13:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1051227 ']' 00:18:31.832 10:13:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:31.832 10:13:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:31.832 10:13:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:31.832 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:31.832 10:13:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:31.832 10:13:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.832 [2024-06-10 10:13:53.509802] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
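The write-error variant starting here repeats the read-test recipe above; what distinguishes it is the failure it injects and measures. Condensed from the calls recorded further down (paths, names and the bdevperf log file /raidtest/tmp.awk9eIZSBr all come from this trace):

  # inject write failures beneath the raid, on BaseBdev1's error bdev
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
  # run the workload
  /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
  # pull the failures-per-second figure for raid_bdev1 out of the bdevperf log
  # (pipeline reconstructed from the grep/awk fragments in the trace; 0.49 in this run)
  grep -v Job /raidtest/tmp.awk9eIZSBr | grep raid_bdev1 | awk '{print $6}'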
00:18:31.832 [2024-06-10 10:13:53.509852] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1051227 ] 00:18:31.832 [2024-06-10 10:13:53.596160] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:31.832 [2024-06-10 10:13:53.657873] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:32.092 [2024-06-10 10:13:53.704777] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:32.092 [2024-06-10 10:13:53.704802] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:32.662 10:13:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:32.662 10:13:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:18:32.662 10:13:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:32.662 10:13:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:32.662 BaseBdev1_malloc 00:18:32.662 10:13:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:32.922 true 00:18:32.922 10:13:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:33.182 [2024-06-10 10:13:54.855362] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:33.182 [2024-06-10 10:13:54.855393] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:33.182 [2024-06-10 10:13:54.855403] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1614d10 00:18:33.182 [2024-06-10 10:13:54.855409] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:33.182 [2024-06-10 10:13:54.856731] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:33.182 [2024-06-10 10:13:54.856750] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:33.182 BaseBdev1 00:18:33.182 10:13:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:33.182 10:13:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:33.182 BaseBdev2_malloc 00:18:33.442 10:13:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:33.442 true 00:18:33.442 10:13:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:33.702 [2024-06-10 10:13:55.406415] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:33.702 [2024-06-10 10:13:55.406444] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:18:33.702 [2024-06-10 10:13:55.406454] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1619710 00:18:33.702 [2024-06-10 10:13:55.406460] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:33.703 [2024-06-10 10:13:55.407611] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:33.703 [2024-06-10 10:13:55.407629] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:33.703 BaseBdev2 00:18:33.703 10:13:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:33.703 10:13:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:33.963 BaseBdev3_malloc 00:18:33.963 10:13:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:33.963 true 00:18:33.963 10:13:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:34.223 [2024-06-10 10:13:55.957520] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:34.223 [2024-06-10 10:13:55.957548] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:34.223 [2024-06-10 10:13:55.957557] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x161a340 00:18:34.223 [2024-06-10 10:13:55.957563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:34.223 [2024-06-10 10:13:55.958703] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:34.223 [2024-06-10 10:13:55.958721] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:34.223 BaseBdev3 00:18:34.223 10:13:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:34.223 10:13:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:34.483 BaseBdev4_malloc 00:18:34.483 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:34.483 true 00:18:34.483 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:34.742 [2024-06-10 10:13:56.500534] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:34.742 [2024-06-10 10:13:56.500558] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:34.742 [2024-06-10 10:13:56.500570] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1613aa0 00:18:34.742 [2024-06-10 10:13:56.500576] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:34.742 [2024-06-10 10:13:56.501722] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
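With the fourth passthru bdev registering just above, the trace next assembles the concat array over the four passthru devices and checks what it reports. The assembly call and the state query, condensed from the RPCs recorded below:

  # 64 KB strip size (-z 64), concat level, superblock enabled (-s)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
  # verify_raid_bdev_state then expects "state": "online" with 4 of 4 base bdevs discovered
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'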
00:18:34.742 [2024-06-10 10:13:56.501739] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:34.742 BaseBdev4 00:18:34.742 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:35.001 [2024-06-10 10:13:56.681016] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:35.001 [2024-06-10 10:13:56.681985] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:35.001 [2024-06-10 10:13:56.682035] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:35.001 [2024-06-10 10:13:56.682083] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:35.001 [2024-06-10 10:13:56.682257] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x161d1b0 00:18:35.001 [2024-06-10 10:13:56.682264] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:35.001 [2024-06-10 10:13:56.682398] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14640d0 00:18:35.001 [2024-06-10 10:13:56.682514] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x161d1b0 00:18:35.001 [2024-06-10 10:13:56.682520] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x161d1b0 00:18:35.001 [2024-06-10 10:13:56.682591] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:35.001 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:35.001 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:35.001 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:35.002 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:35.002 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:35.002 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:35.002 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:35.002 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:35.002 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:35.002 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:35.002 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.002 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:35.262 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:35.262 "name": "raid_bdev1", 00:18:35.262 "uuid": "6731e1aa-7170-4f2c-b62a-07c8c7248757", 00:18:35.262 "strip_size_kb": 64, 00:18:35.262 "state": "online", 00:18:35.262 "raid_level": "concat", 00:18:35.262 "superblock": true, 00:18:35.262 "num_base_bdevs": 4, 00:18:35.262 "num_base_bdevs_discovered": 4, 00:18:35.262 
"num_base_bdevs_operational": 4, 00:18:35.262 "base_bdevs_list": [ 00:18:35.262 { 00:18:35.262 "name": "BaseBdev1", 00:18:35.262 "uuid": "62c7efc0-66be-50aa-90fa-ec8af4dddaf4", 00:18:35.262 "is_configured": true, 00:18:35.262 "data_offset": 2048, 00:18:35.262 "data_size": 63488 00:18:35.262 }, 00:18:35.262 { 00:18:35.262 "name": "BaseBdev2", 00:18:35.262 "uuid": "1f965701-0b16-5c07-a4aa-ed5840e920e2", 00:18:35.262 "is_configured": true, 00:18:35.262 "data_offset": 2048, 00:18:35.262 "data_size": 63488 00:18:35.262 }, 00:18:35.262 { 00:18:35.262 "name": "BaseBdev3", 00:18:35.262 "uuid": "34dfa8cd-7552-522a-870a-cbfdbdbd5c56", 00:18:35.262 "is_configured": true, 00:18:35.262 "data_offset": 2048, 00:18:35.262 "data_size": 63488 00:18:35.262 }, 00:18:35.262 { 00:18:35.262 "name": "BaseBdev4", 00:18:35.262 "uuid": "6495e058-aa8d-5b9e-a2b9-8367b4c6b9c2", 00:18:35.262 "is_configured": true, 00:18:35.262 "data_offset": 2048, 00:18:35.262 "data_size": 63488 00:18:35.262 } 00:18:35.262 ] 00:18:35.262 }' 00:18:35.262 10:13:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:35.262 10:13:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:35.834 10:13:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:35.834 10:13:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:35.834 [2024-06-10 10:13:57.535361] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1616060 00:18:36.776 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.036 10:13:58 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.036 "name": "raid_bdev1", 00:18:37.036 "uuid": "6731e1aa-7170-4f2c-b62a-07c8c7248757", 00:18:37.036 "strip_size_kb": 64, 00:18:37.036 "state": "online", 00:18:37.036 "raid_level": "concat", 00:18:37.036 "superblock": true, 00:18:37.036 "num_base_bdevs": 4, 00:18:37.036 "num_base_bdevs_discovered": 4, 00:18:37.036 "num_base_bdevs_operational": 4, 00:18:37.036 "base_bdevs_list": [ 00:18:37.036 { 00:18:37.036 "name": "BaseBdev1", 00:18:37.036 "uuid": "62c7efc0-66be-50aa-90fa-ec8af4dddaf4", 00:18:37.036 "is_configured": true, 00:18:37.036 "data_offset": 2048, 00:18:37.036 "data_size": 63488 00:18:37.036 }, 00:18:37.036 { 00:18:37.036 "name": "BaseBdev2", 00:18:37.036 "uuid": "1f965701-0b16-5c07-a4aa-ed5840e920e2", 00:18:37.036 "is_configured": true, 00:18:37.036 "data_offset": 2048, 00:18:37.036 "data_size": 63488 00:18:37.036 }, 00:18:37.036 { 00:18:37.036 "name": "BaseBdev3", 00:18:37.036 "uuid": "34dfa8cd-7552-522a-870a-cbfdbdbd5c56", 00:18:37.036 "is_configured": true, 00:18:37.036 "data_offset": 2048, 00:18:37.036 "data_size": 63488 00:18:37.036 }, 00:18:37.036 { 00:18:37.036 "name": "BaseBdev4", 00:18:37.036 "uuid": "6495e058-aa8d-5b9e-a2b9-8367b4c6b9c2", 00:18:37.036 "is_configured": true, 00:18:37.036 "data_offset": 2048, 00:18:37.036 "data_size": 63488 00:18:37.036 } 00:18:37.036 ] 00:18:37.036 }' 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.036 10:13:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:37.608 10:13:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:37.869 [2024-06-10 10:13:59.569709] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:37.869 [2024-06-10 10:13:59.569740] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:37.869 [2024-06-10 10:13:59.572337] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:37.869 [2024-06-10 10:13:59.572364] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:37.869 [2024-06-10 10:13:59.572393] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:37.869 [2024-06-10 10:13:59.572399] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x161d1b0 name raid_bdev1, state offline 00:18:37.869 0 00:18:37.869 10:13:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1051227 00:18:37.869 10:13:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1051227 ']' 00:18:37.869 10:13:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1051227 00:18:37.869 10:13:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:18:37.869 10:13:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:37.869 10:13:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1051227 00:18:37.869 10:13:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:37.869 10:13:59 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:37.869 10:13:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1051227' 00:18:37.869 killing process with pid 1051227 00:18:37.869 10:13:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1051227 00:18:37.869 [2024-06-10 10:13:59.638155] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:37.869 10:13:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1051227 00:18:37.869 [2024-06-10 10:13:59.655245] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:38.130 10:13:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.awk9eIZSBr 00:18:38.130 10:13:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:38.130 10:13:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:38.130 10:13:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:18:38.130 10:13:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:18:38.130 10:13:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:38.130 10:13:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:38.130 10:13:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:18:38.130 00:18:38.130 real 0m6.364s 00:18:38.130 user 0m10.271s 00:18:38.130 sys 0m0.842s 00:18:38.130 10:13:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:38.130 10:13:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:38.130 ************************************ 00:18:38.130 END TEST raid_write_error_test 00:18:38.130 ************************************ 00:18:38.130 10:13:59 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:18:38.130 10:13:59 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:18:38.130 10:13:59 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:18:38.130 10:13:59 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:38.130 10:13:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:38.130 ************************************ 00:18:38.130 START TEST raid_state_function_test 00:18:38.130 ************************************ 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 4 false 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- 
# (( i++ )) 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1052428 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1052428' 00:18:38.130 Process raid pid: 1052428 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1052428 /var/tmp/spdk-raid.sock 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1052428 ']' 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:38.130 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
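raid_state_function_test takes a different angle from the two I/O tests above: against the plain bdev_svc app it has just started on the same raid socket, it creates the raid before any of its base bdevs exist and checks that the array stays in the "configuring" state. Condensed from the RPCs recorded below:

  # register a raid1 array whose four base bdevs do not exist yet
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  # the dump is expected to show "state": "configuring" and every base bdev with "is_configured": false
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'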
00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:38.130 10:13:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:38.130 [2024-06-10 10:13:59.923604] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:18:38.130 [2024-06-10 10:13:59.923646] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:38.130 [2024-06-10 10:13:59.989276] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:38.391 [2024-06-10 10:14:00.053984] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:38.391 [2024-06-10 10:14:00.103296] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:38.391 [2024-06-10 10:14:00.103321] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:38.962 10:14:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:38.962 10:14:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:18:38.962 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:39.223 [2024-06-10 10:14:00.934695] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:39.223 [2024-06-10 10:14:00.934726] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:39.223 [2024-06-10 10:14:00.934732] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:39.223 [2024-06-10 10:14:00.934739] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:39.223 [2024-06-10 10:14:00.934745] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:39.223 [2024-06-10 10:14:00.934750] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:39.223 [2024-06-10 10:14:00.934755] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:39.223 [2024-06-10 10:14:00.934760] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.223 10:14:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:39.483 10:14:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:39.483 "name": "Existed_Raid", 00:18:39.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:39.483 "strip_size_kb": 0, 00:18:39.483 "state": "configuring", 00:18:39.483 "raid_level": "raid1", 00:18:39.483 "superblock": false, 00:18:39.483 "num_base_bdevs": 4, 00:18:39.483 "num_base_bdevs_discovered": 0, 00:18:39.483 "num_base_bdevs_operational": 4, 00:18:39.483 "base_bdevs_list": [ 00:18:39.483 { 00:18:39.483 "name": "BaseBdev1", 00:18:39.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:39.483 "is_configured": false, 00:18:39.483 "data_offset": 0, 00:18:39.483 "data_size": 0 00:18:39.483 }, 00:18:39.483 { 00:18:39.483 "name": "BaseBdev2", 00:18:39.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:39.483 "is_configured": false, 00:18:39.483 "data_offset": 0, 00:18:39.483 "data_size": 0 00:18:39.483 }, 00:18:39.483 { 00:18:39.483 "name": "BaseBdev3", 00:18:39.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:39.483 "is_configured": false, 00:18:39.483 "data_offset": 0, 00:18:39.483 "data_size": 0 00:18:39.483 }, 00:18:39.483 { 00:18:39.483 "name": "BaseBdev4", 00:18:39.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:39.483 "is_configured": false, 00:18:39.483 "data_offset": 0, 00:18:39.483 "data_size": 0 00:18:39.483 } 00:18:39.483 ] 00:18:39.483 }' 00:18:39.483 10:14:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:39.483 10:14:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.055 10:14:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:40.055 [2024-06-10 10:14:01.816831] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:40.055 [2024-06-10 10:14:01.816846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa7b20 name Existed_Raid, state configuring 00:18:40.055 10:14:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:40.315 [2024-06-10 10:14:02.005315] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:40.315 [2024-06-10 10:14:02.005334] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:40.315 [2024-06-10 10:14:02.005338] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:40.315 [2024-06-10 10:14:02.005344] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:40.315 [2024-06-10 10:14:02.005349] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:40.315 [2024-06-10 10:14:02.005354] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:40.315 [2024-06-10 10:14:02.005359] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:40.315 [2024-06-10 10:14:02.005364] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:40.315 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:40.576 [2024-06-10 10:14:02.200522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:40.576 BaseBdev1 00:18:40.576 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:40.576 10:14:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:18:40.576 10:14:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:40.576 10:14:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:40.576 10:14:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:40.576 10:14:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:40.576 10:14:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:40.576 10:14:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:40.837 [ 00:18:40.837 { 00:18:40.837 "name": "BaseBdev1", 00:18:40.837 "aliases": [ 00:18:40.837 "cce94509-f797-4214-9ef6-cac865ea03c2" 00:18:40.837 ], 00:18:40.837 "product_name": "Malloc disk", 00:18:40.837 "block_size": 512, 00:18:40.837 "num_blocks": 65536, 00:18:40.837 "uuid": "cce94509-f797-4214-9ef6-cac865ea03c2", 00:18:40.837 "assigned_rate_limits": { 00:18:40.837 "rw_ios_per_sec": 0, 00:18:40.837 "rw_mbytes_per_sec": 0, 00:18:40.837 "r_mbytes_per_sec": 0, 00:18:40.837 "w_mbytes_per_sec": 0 00:18:40.837 }, 00:18:40.837 "claimed": true, 00:18:40.837 "claim_type": "exclusive_write", 00:18:40.837 "zoned": false, 00:18:40.837 "supported_io_types": { 00:18:40.837 "read": true, 00:18:40.837 "write": true, 00:18:40.837 "unmap": true, 00:18:40.837 "write_zeroes": true, 00:18:40.837 "flush": true, 00:18:40.837 "reset": true, 00:18:40.837 "compare": false, 00:18:40.837 "compare_and_write": false, 00:18:40.837 "abort": true, 00:18:40.837 "nvme_admin": false, 00:18:40.837 "nvme_io": false 00:18:40.837 }, 00:18:40.837 "memory_domains": [ 00:18:40.837 { 00:18:40.837 "dma_device_id": "system", 00:18:40.837 "dma_device_type": 1 00:18:40.837 }, 00:18:40.837 { 00:18:40.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.837 "dma_device_type": 2 00:18:40.837 } 00:18:40.837 ], 00:18:40.837 "driver_specific": {} 00:18:40.837 } 00:18:40.837 ] 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:40.837 
10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.837 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:41.097 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.097 "name": "Existed_Raid", 00:18:41.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:41.097 "strip_size_kb": 0, 00:18:41.097 "state": "configuring", 00:18:41.097 "raid_level": "raid1", 00:18:41.097 "superblock": false, 00:18:41.097 "num_base_bdevs": 4, 00:18:41.097 "num_base_bdevs_discovered": 1, 00:18:41.097 "num_base_bdevs_operational": 4, 00:18:41.097 "base_bdevs_list": [ 00:18:41.097 { 00:18:41.097 "name": "BaseBdev1", 00:18:41.097 "uuid": "cce94509-f797-4214-9ef6-cac865ea03c2", 00:18:41.097 "is_configured": true, 00:18:41.097 "data_offset": 0, 00:18:41.097 "data_size": 65536 00:18:41.097 }, 00:18:41.097 { 00:18:41.097 "name": "BaseBdev2", 00:18:41.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:41.097 "is_configured": false, 00:18:41.097 "data_offset": 0, 00:18:41.097 "data_size": 0 00:18:41.097 }, 00:18:41.097 { 00:18:41.097 "name": "BaseBdev3", 00:18:41.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:41.097 "is_configured": false, 00:18:41.097 "data_offset": 0, 00:18:41.097 "data_size": 0 00:18:41.097 }, 00:18:41.097 { 00:18:41.097 "name": "BaseBdev4", 00:18:41.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:41.098 "is_configured": false, 00:18:41.098 "data_offset": 0, 00:18:41.098 "data_size": 0 00:18:41.098 } 00:18:41.098 ] 00:18:41.098 }' 00:18:41.098 10:14:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.098 10:14:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:41.668 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:41.668 [2024-06-10 10:14:03.479755] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:41.668 [2024-06-10 10:14:03.479781] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa73b0 name Existed_Raid, state configuring 00:18:41.668 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 
BaseBdev4' -n Existed_Raid 00:18:41.929 [2024-06-10 10:14:03.668261] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:41.929 [2024-06-10 10:14:03.669411] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:41.929 [2024-06-10 10:14:03.669436] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:41.929 [2024-06-10 10:14:03.669442] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:41.929 [2024-06-10 10:14:03.669447] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:41.929 [2024-06-10 10:14:03.669452] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:41.929 [2024-06-10 10:14:03.669457] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.929 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:42.190 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:42.190 "name": "Existed_Raid", 00:18:42.190 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.190 "strip_size_kb": 0, 00:18:42.190 "state": "configuring", 00:18:42.190 "raid_level": "raid1", 00:18:42.190 "superblock": false, 00:18:42.190 "num_base_bdevs": 4, 00:18:42.190 "num_base_bdevs_discovered": 1, 00:18:42.190 "num_base_bdevs_operational": 4, 00:18:42.190 "base_bdevs_list": [ 00:18:42.190 { 00:18:42.190 "name": "BaseBdev1", 00:18:42.190 "uuid": "cce94509-f797-4214-9ef6-cac865ea03c2", 00:18:42.190 "is_configured": true, 00:18:42.190 "data_offset": 0, 00:18:42.190 "data_size": 65536 00:18:42.190 }, 00:18:42.190 { 00:18:42.190 "name": "BaseBdev2", 00:18:42.190 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.190 "is_configured": false, 00:18:42.190 "data_offset": 0, 
00:18:42.190 "data_size": 0 00:18:42.190 }, 00:18:42.190 { 00:18:42.190 "name": "BaseBdev3", 00:18:42.190 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.190 "is_configured": false, 00:18:42.190 "data_offset": 0, 00:18:42.190 "data_size": 0 00:18:42.190 }, 00:18:42.190 { 00:18:42.190 "name": "BaseBdev4", 00:18:42.190 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.190 "is_configured": false, 00:18:42.190 "data_offset": 0, 00:18:42.190 "data_size": 0 00:18:42.190 } 00:18:42.190 ] 00:18:42.190 }' 00:18:42.190 10:14:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:42.190 10:14:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:42.814 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:42.814 [2024-06-10 10:14:04.591527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:42.814 BaseBdev2 00:18:42.814 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:42.814 10:14:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:18:42.814 10:14:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:42.814 10:14:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:42.814 10:14:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:42.814 10:14:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:42.814 10:14:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:43.099 10:14:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:43.360 [ 00:18:43.360 { 00:18:43.360 "name": "BaseBdev2", 00:18:43.360 "aliases": [ 00:18:43.360 "e13b22e8-c219-41a7-b9fe-4dce869dc949" 00:18:43.360 ], 00:18:43.360 "product_name": "Malloc disk", 00:18:43.360 "block_size": 512, 00:18:43.360 "num_blocks": 65536, 00:18:43.360 "uuid": "e13b22e8-c219-41a7-b9fe-4dce869dc949", 00:18:43.360 "assigned_rate_limits": { 00:18:43.360 "rw_ios_per_sec": 0, 00:18:43.360 "rw_mbytes_per_sec": 0, 00:18:43.360 "r_mbytes_per_sec": 0, 00:18:43.360 "w_mbytes_per_sec": 0 00:18:43.360 }, 00:18:43.360 "claimed": true, 00:18:43.360 "claim_type": "exclusive_write", 00:18:43.360 "zoned": false, 00:18:43.360 "supported_io_types": { 00:18:43.360 "read": true, 00:18:43.360 "write": true, 00:18:43.360 "unmap": true, 00:18:43.360 "write_zeroes": true, 00:18:43.360 "flush": true, 00:18:43.360 "reset": true, 00:18:43.360 "compare": false, 00:18:43.360 "compare_and_write": false, 00:18:43.360 "abort": true, 00:18:43.360 "nvme_admin": false, 00:18:43.360 "nvme_io": false 00:18:43.360 }, 00:18:43.360 "memory_domains": [ 00:18:43.360 { 00:18:43.360 "dma_device_id": "system", 00:18:43.360 "dma_device_type": 1 00:18:43.360 }, 00:18:43.360 { 00:18:43.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:43.360 "dma_device_type": 2 00:18:43.360 } 00:18:43.360 ], 00:18:43.360 "driver_specific": {} 00:18:43.360 } 00:18:43.360 ] 00:18:43.360 
10:14:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.360 10:14:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:43.360 10:14:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.360 "name": "Existed_Raid", 00:18:43.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.360 "strip_size_kb": 0, 00:18:43.360 "state": "configuring", 00:18:43.360 "raid_level": "raid1", 00:18:43.360 "superblock": false, 00:18:43.360 "num_base_bdevs": 4, 00:18:43.360 "num_base_bdevs_discovered": 2, 00:18:43.360 "num_base_bdevs_operational": 4, 00:18:43.360 "base_bdevs_list": [ 00:18:43.360 { 00:18:43.360 "name": "BaseBdev1", 00:18:43.360 "uuid": "cce94509-f797-4214-9ef6-cac865ea03c2", 00:18:43.360 "is_configured": true, 00:18:43.360 "data_offset": 0, 00:18:43.360 "data_size": 65536 00:18:43.360 }, 00:18:43.360 { 00:18:43.360 "name": "BaseBdev2", 00:18:43.360 "uuid": "e13b22e8-c219-41a7-b9fe-4dce869dc949", 00:18:43.360 "is_configured": true, 00:18:43.360 "data_offset": 0, 00:18:43.360 "data_size": 65536 00:18:43.360 }, 00:18:43.360 { 00:18:43.360 "name": "BaseBdev3", 00:18:43.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.360 "is_configured": false, 00:18:43.360 "data_offset": 0, 00:18:43.360 "data_size": 0 00:18:43.360 }, 00:18:43.360 { 00:18:43.360 "name": "BaseBdev4", 00:18:43.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.360 "is_configured": false, 00:18:43.360 "data_offset": 0, 00:18:43.360 "data_size": 0 00:18:43.360 } 00:18:43.360 ] 00:18:43.360 }' 00:18:43.360 10:14:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.360 10:14:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:43.932 10:14:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:44.193 [2024-06-10 10:14:05.867735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:44.193 BaseBdev3 00:18:44.193 10:14:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:44.193 10:14:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:18:44.193 10:14:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:44.193 10:14:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:44.193 10:14:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:44.193 10:14:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:44.193 10:14:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:44.454 [ 00:18:44.454 { 00:18:44.454 "name": "BaseBdev3", 00:18:44.454 "aliases": [ 00:18:44.454 "5e7d67e8-2ba5-431c-840a-1c1dc30dc19d" 00:18:44.454 ], 00:18:44.454 "product_name": "Malloc disk", 00:18:44.454 "block_size": 512, 00:18:44.454 "num_blocks": 65536, 00:18:44.454 "uuid": "5e7d67e8-2ba5-431c-840a-1c1dc30dc19d", 00:18:44.454 "assigned_rate_limits": { 00:18:44.454 "rw_ios_per_sec": 0, 00:18:44.454 "rw_mbytes_per_sec": 0, 00:18:44.454 "r_mbytes_per_sec": 0, 00:18:44.454 "w_mbytes_per_sec": 0 00:18:44.454 }, 00:18:44.454 "claimed": true, 00:18:44.454 "claim_type": "exclusive_write", 00:18:44.454 "zoned": false, 00:18:44.454 "supported_io_types": { 00:18:44.454 "read": true, 00:18:44.454 "write": true, 00:18:44.454 "unmap": true, 00:18:44.454 "write_zeroes": true, 00:18:44.454 "flush": true, 00:18:44.454 "reset": true, 00:18:44.454 "compare": false, 00:18:44.454 "compare_and_write": false, 00:18:44.454 "abort": true, 00:18:44.454 "nvme_admin": false, 00:18:44.454 "nvme_io": false 00:18:44.454 }, 00:18:44.454 "memory_domains": [ 00:18:44.454 { 00:18:44.454 "dma_device_id": "system", 00:18:44.454 "dma_device_type": 1 00:18:44.454 }, 00:18:44.454 { 00:18:44.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.454 "dma_device_type": 2 00:18:44.454 } 00:18:44.454 ], 00:18:44.454 "driver_specific": {} 00:18:44.454 } 00:18:44.454 ] 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:44.454 
10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.454 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:44.715 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.715 "name": "Existed_Raid", 00:18:44.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.715 "strip_size_kb": 0, 00:18:44.715 "state": "configuring", 00:18:44.715 "raid_level": "raid1", 00:18:44.715 "superblock": false, 00:18:44.715 "num_base_bdevs": 4, 00:18:44.715 "num_base_bdevs_discovered": 3, 00:18:44.715 "num_base_bdevs_operational": 4, 00:18:44.715 "base_bdevs_list": [ 00:18:44.715 { 00:18:44.715 "name": "BaseBdev1", 00:18:44.715 "uuid": "cce94509-f797-4214-9ef6-cac865ea03c2", 00:18:44.715 "is_configured": true, 00:18:44.715 "data_offset": 0, 00:18:44.715 "data_size": 65536 00:18:44.715 }, 00:18:44.715 { 00:18:44.715 "name": "BaseBdev2", 00:18:44.715 "uuid": "e13b22e8-c219-41a7-b9fe-4dce869dc949", 00:18:44.715 "is_configured": true, 00:18:44.715 "data_offset": 0, 00:18:44.715 "data_size": 65536 00:18:44.715 }, 00:18:44.715 { 00:18:44.715 "name": "BaseBdev3", 00:18:44.715 "uuid": "5e7d67e8-2ba5-431c-840a-1c1dc30dc19d", 00:18:44.715 "is_configured": true, 00:18:44.715 "data_offset": 0, 00:18:44.715 "data_size": 65536 00:18:44.715 }, 00:18:44.715 { 00:18:44.715 "name": "BaseBdev4", 00:18:44.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.715 "is_configured": false, 00:18:44.715 "data_offset": 0, 00:18:44.715 "data_size": 0 00:18:44.715 } 00:18:44.715 ] 00:18:44.715 }' 00:18:44.715 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.715 10:14:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.287 10:14:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:45.287 [2024-06-10 10:14:07.127865] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:45.287 [2024-06-10 10:14:07.127890] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xaa84c0 00:18:45.287 [2024-06-10 10:14:07.127895] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:45.287 [2024-06-10 10:14:07.128037] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaa0900 00:18:45.287 [2024-06-10 10:14:07.128135] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaa84c0 00:18:45.287 [2024-06-10 10:14:07.128141] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created 
with name Existed_Raid, raid_bdev 0xaa84c0 00:18:45.287 [2024-06-10 10:14:07.128260] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:45.287 BaseBdev4 00:18:45.287 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:45.287 10:14:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:18:45.287 10:14:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:45.287 10:14:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:45.287 10:14:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:45.287 10:14:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:45.287 10:14:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:45.548 10:14:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:45.816 [ 00:18:45.816 { 00:18:45.816 "name": "BaseBdev4", 00:18:45.816 "aliases": [ 00:18:45.816 "059ef300-d606-4360-89b5-0e654afcc46b" 00:18:45.816 ], 00:18:45.817 "product_name": "Malloc disk", 00:18:45.817 "block_size": 512, 00:18:45.817 "num_blocks": 65536, 00:18:45.817 "uuid": "059ef300-d606-4360-89b5-0e654afcc46b", 00:18:45.817 "assigned_rate_limits": { 00:18:45.817 "rw_ios_per_sec": 0, 00:18:45.817 "rw_mbytes_per_sec": 0, 00:18:45.817 "r_mbytes_per_sec": 0, 00:18:45.817 "w_mbytes_per_sec": 0 00:18:45.817 }, 00:18:45.817 "claimed": true, 00:18:45.817 "claim_type": "exclusive_write", 00:18:45.817 "zoned": false, 00:18:45.817 "supported_io_types": { 00:18:45.817 "read": true, 00:18:45.817 "write": true, 00:18:45.817 "unmap": true, 00:18:45.817 "write_zeroes": true, 00:18:45.817 "flush": true, 00:18:45.817 "reset": true, 00:18:45.817 "compare": false, 00:18:45.817 "compare_and_write": false, 00:18:45.817 "abort": true, 00:18:45.817 "nvme_admin": false, 00:18:45.817 "nvme_io": false 00:18:45.817 }, 00:18:45.817 "memory_domains": [ 00:18:45.817 { 00:18:45.817 "dma_device_id": "system", 00:18:45.817 "dma_device_type": 1 00:18:45.817 }, 00:18:45.817 { 00:18:45.817 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.817 "dma_device_type": 2 00:18:45.817 } 00:18:45.817 ], 00:18:45.817 "driver_specific": {} 00:18:45.817 } 00:18:45.817 ] 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:45.817 
10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.817 "name": "Existed_Raid", 00:18:45.817 "uuid": "493f663b-0feb-4385-88e3-220f79966435", 00:18:45.817 "strip_size_kb": 0, 00:18:45.817 "state": "online", 00:18:45.817 "raid_level": "raid1", 00:18:45.817 "superblock": false, 00:18:45.817 "num_base_bdevs": 4, 00:18:45.817 "num_base_bdevs_discovered": 4, 00:18:45.817 "num_base_bdevs_operational": 4, 00:18:45.817 "base_bdevs_list": [ 00:18:45.817 { 00:18:45.817 "name": "BaseBdev1", 00:18:45.817 "uuid": "cce94509-f797-4214-9ef6-cac865ea03c2", 00:18:45.817 "is_configured": true, 00:18:45.817 "data_offset": 0, 00:18:45.817 "data_size": 65536 00:18:45.817 }, 00:18:45.817 { 00:18:45.817 "name": "BaseBdev2", 00:18:45.817 "uuid": "e13b22e8-c219-41a7-b9fe-4dce869dc949", 00:18:45.817 "is_configured": true, 00:18:45.817 "data_offset": 0, 00:18:45.817 "data_size": 65536 00:18:45.817 }, 00:18:45.817 { 00:18:45.817 "name": "BaseBdev3", 00:18:45.817 "uuid": "5e7d67e8-2ba5-431c-840a-1c1dc30dc19d", 00:18:45.817 "is_configured": true, 00:18:45.817 "data_offset": 0, 00:18:45.817 "data_size": 65536 00:18:45.817 }, 00:18:45.817 { 00:18:45.817 "name": "BaseBdev4", 00:18:45.817 "uuid": "059ef300-d606-4360-89b5-0e654afcc46b", 00:18:45.817 "is_configured": true, 00:18:45.817 "data_offset": 0, 00:18:45.817 "data_size": 65536 00:18:45.817 } 00:18:45.817 ] 00:18:45.817 }' 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:45.817 10:14:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:46.388 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:46.388 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:46.388 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:46.388 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:46.388 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:46.388 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:46.388 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:46.388 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:46.648 [2024-06-10 10:14:08.275005] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:46.648 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:46.648 "name": "Existed_Raid", 00:18:46.648 "aliases": [ 00:18:46.648 "493f663b-0feb-4385-88e3-220f79966435" 00:18:46.648 ], 00:18:46.648 "product_name": "Raid Volume", 00:18:46.648 "block_size": 512, 00:18:46.648 "num_blocks": 65536, 00:18:46.648 "uuid": "493f663b-0feb-4385-88e3-220f79966435", 00:18:46.648 "assigned_rate_limits": { 00:18:46.648 "rw_ios_per_sec": 0, 00:18:46.648 "rw_mbytes_per_sec": 0, 00:18:46.648 "r_mbytes_per_sec": 0, 00:18:46.648 "w_mbytes_per_sec": 0 00:18:46.648 }, 00:18:46.648 "claimed": false, 00:18:46.648 "zoned": false, 00:18:46.648 "supported_io_types": { 00:18:46.648 "read": true, 00:18:46.648 "write": true, 00:18:46.648 "unmap": false, 00:18:46.648 "write_zeroes": true, 00:18:46.648 "flush": false, 00:18:46.648 "reset": true, 00:18:46.648 "compare": false, 00:18:46.648 "compare_and_write": false, 00:18:46.648 "abort": false, 00:18:46.648 "nvme_admin": false, 00:18:46.648 "nvme_io": false 00:18:46.648 }, 00:18:46.648 "memory_domains": [ 00:18:46.648 { 00:18:46.648 "dma_device_id": "system", 00:18:46.648 "dma_device_type": 1 00:18:46.648 }, 00:18:46.648 { 00:18:46.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.649 "dma_device_type": 2 00:18:46.649 }, 00:18:46.649 { 00:18:46.649 "dma_device_id": "system", 00:18:46.649 "dma_device_type": 1 00:18:46.649 }, 00:18:46.649 { 00:18:46.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.649 "dma_device_type": 2 00:18:46.649 }, 00:18:46.649 { 00:18:46.649 "dma_device_id": "system", 00:18:46.649 "dma_device_type": 1 00:18:46.649 }, 00:18:46.649 { 00:18:46.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.649 "dma_device_type": 2 00:18:46.649 }, 00:18:46.649 { 00:18:46.649 "dma_device_id": "system", 00:18:46.649 "dma_device_type": 1 00:18:46.649 }, 00:18:46.649 { 00:18:46.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.649 "dma_device_type": 2 00:18:46.649 } 00:18:46.649 ], 00:18:46.649 "driver_specific": { 00:18:46.649 "raid": { 00:18:46.649 "uuid": "493f663b-0feb-4385-88e3-220f79966435", 00:18:46.649 "strip_size_kb": 0, 00:18:46.649 "state": "online", 00:18:46.649 "raid_level": "raid1", 00:18:46.649 "superblock": false, 00:18:46.649 "num_base_bdevs": 4, 00:18:46.649 "num_base_bdevs_discovered": 4, 00:18:46.649 "num_base_bdevs_operational": 4, 00:18:46.649 "base_bdevs_list": [ 00:18:46.649 { 00:18:46.649 "name": "BaseBdev1", 00:18:46.649 "uuid": "cce94509-f797-4214-9ef6-cac865ea03c2", 00:18:46.649 "is_configured": true, 00:18:46.649 "data_offset": 0, 00:18:46.649 "data_size": 65536 00:18:46.649 }, 00:18:46.649 { 00:18:46.649 "name": "BaseBdev2", 00:18:46.649 "uuid": "e13b22e8-c219-41a7-b9fe-4dce869dc949", 00:18:46.649 "is_configured": true, 00:18:46.649 "data_offset": 0, 00:18:46.649 "data_size": 65536 00:18:46.649 }, 00:18:46.649 { 00:18:46.649 "name": "BaseBdev3", 00:18:46.649 "uuid": "5e7d67e8-2ba5-431c-840a-1c1dc30dc19d", 00:18:46.649 "is_configured": true, 00:18:46.649 "data_offset": 0, 00:18:46.649 "data_size": 65536 00:18:46.649 }, 00:18:46.649 { 00:18:46.649 "name": "BaseBdev4", 00:18:46.649 "uuid": "059ef300-d606-4360-89b5-0e654afcc46b", 00:18:46.649 "is_configured": true, 00:18:46.649 "data_offset": 0, 00:18:46.649 "data_size": 65536 00:18:46.649 } 00:18:46.649 ] 00:18:46.649 } 00:18:46.649 } 00:18:46.649 }' 00:18:46.649 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:46.649 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:46.649 BaseBdev2 00:18:46.649 BaseBdev3 00:18:46.649 BaseBdev4' 00:18:46.649 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:46.649 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:46.649 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:46.911 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:46.911 "name": "BaseBdev1", 00:18:46.911 "aliases": [ 00:18:46.911 "cce94509-f797-4214-9ef6-cac865ea03c2" 00:18:46.911 ], 00:18:46.911 "product_name": "Malloc disk", 00:18:46.911 "block_size": 512, 00:18:46.911 "num_blocks": 65536, 00:18:46.911 "uuid": "cce94509-f797-4214-9ef6-cac865ea03c2", 00:18:46.911 "assigned_rate_limits": { 00:18:46.911 "rw_ios_per_sec": 0, 00:18:46.911 "rw_mbytes_per_sec": 0, 00:18:46.911 "r_mbytes_per_sec": 0, 00:18:46.911 "w_mbytes_per_sec": 0 00:18:46.911 }, 00:18:46.911 "claimed": true, 00:18:46.911 "claim_type": "exclusive_write", 00:18:46.911 "zoned": false, 00:18:46.911 "supported_io_types": { 00:18:46.911 "read": true, 00:18:46.911 "write": true, 00:18:46.911 "unmap": true, 00:18:46.911 "write_zeroes": true, 00:18:46.911 "flush": true, 00:18:46.911 "reset": true, 00:18:46.911 "compare": false, 00:18:46.911 "compare_and_write": false, 00:18:46.911 "abort": true, 00:18:46.911 "nvme_admin": false, 00:18:46.911 "nvme_io": false 00:18:46.911 }, 00:18:46.911 "memory_domains": [ 00:18:46.911 { 00:18:46.911 "dma_device_id": "system", 00:18:46.911 "dma_device_type": 1 00:18:46.911 }, 00:18:46.911 { 00:18:46.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.911 "dma_device_type": 2 00:18:46.911 } 00:18:46.911 ], 00:18:46.911 "driver_specific": {} 00:18:46.911 }' 00:18:46.911 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.911 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.911 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:46.911 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.911 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.911 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:46.911 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.911 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.171 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:47.171 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.171 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.171 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:47.171 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:47.171 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:47.171 10:14:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:47.171 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:47.171 "name": "BaseBdev2", 00:18:47.171 "aliases": [ 00:18:47.171 "e13b22e8-c219-41a7-b9fe-4dce869dc949" 00:18:47.171 ], 00:18:47.171 "product_name": "Malloc disk", 00:18:47.171 "block_size": 512, 00:18:47.171 "num_blocks": 65536, 00:18:47.171 "uuid": "e13b22e8-c219-41a7-b9fe-4dce869dc949", 00:18:47.171 "assigned_rate_limits": { 00:18:47.171 "rw_ios_per_sec": 0, 00:18:47.171 "rw_mbytes_per_sec": 0, 00:18:47.171 "r_mbytes_per_sec": 0, 00:18:47.171 "w_mbytes_per_sec": 0 00:18:47.171 }, 00:18:47.171 "claimed": true, 00:18:47.172 "claim_type": "exclusive_write", 00:18:47.172 "zoned": false, 00:18:47.172 "supported_io_types": { 00:18:47.172 "read": true, 00:18:47.172 "write": true, 00:18:47.172 "unmap": true, 00:18:47.172 "write_zeroes": true, 00:18:47.172 "flush": true, 00:18:47.172 "reset": true, 00:18:47.172 "compare": false, 00:18:47.172 "compare_and_write": false, 00:18:47.172 "abort": true, 00:18:47.172 "nvme_admin": false, 00:18:47.172 "nvme_io": false 00:18:47.172 }, 00:18:47.172 "memory_domains": [ 00:18:47.172 { 00:18:47.172 "dma_device_id": "system", 00:18:47.172 "dma_device_type": 1 00:18:47.172 }, 00:18:47.172 { 00:18:47.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.172 "dma_device_type": 2 00:18:47.172 } 00:18:47.172 ], 00:18:47.172 "driver_specific": {} 00:18:47.172 }' 00:18:47.172 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.432 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.432 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:47.432 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.432 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.432 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:47.432 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.432 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.432 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:47.432 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.692 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.692 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:47.692 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:47.692 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:47.692 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:47.692 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:47.692 "name": "BaseBdev3", 00:18:47.692 "aliases": [ 00:18:47.692 "5e7d67e8-2ba5-431c-840a-1c1dc30dc19d" 00:18:47.692 ], 00:18:47.692 "product_name": 
"Malloc disk", 00:18:47.692 "block_size": 512, 00:18:47.692 "num_blocks": 65536, 00:18:47.692 "uuid": "5e7d67e8-2ba5-431c-840a-1c1dc30dc19d", 00:18:47.692 "assigned_rate_limits": { 00:18:47.692 "rw_ios_per_sec": 0, 00:18:47.692 "rw_mbytes_per_sec": 0, 00:18:47.692 "r_mbytes_per_sec": 0, 00:18:47.692 "w_mbytes_per_sec": 0 00:18:47.692 }, 00:18:47.692 "claimed": true, 00:18:47.692 "claim_type": "exclusive_write", 00:18:47.692 "zoned": false, 00:18:47.692 "supported_io_types": { 00:18:47.692 "read": true, 00:18:47.692 "write": true, 00:18:47.692 "unmap": true, 00:18:47.692 "write_zeroes": true, 00:18:47.692 "flush": true, 00:18:47.692 "reset": true, 00:18:47.692 "compare": false, 00:18:47.692 "compare_and_write": false, 00:18:47.692 "abort": true, 00:18:47.692 "nvme_admin": false, 00:18:47.692 "nvme_io": false 00:18:47.692 }, 00:18:47.692 "memory_domains": [ 00:18:47.692 { 00:18:47.692 "dma_device_id": "system", 00:18:47.692 "dma_device_type": 1 00:18:47.692 }, 00:18:47.692 { 00:18:47.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.692 "dma_device_type": 2 00:18:47.692 } 00:18:47.692 ], 00:18:47.692 "driver_specific": {} 00:18:47.692 }' 00:18:47.692 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.953 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.953 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:47.953 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.953 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.953 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:47.953 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.953 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.953 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:48.214 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.214 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.214 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:48.214 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:48.214 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:48.214 10:14:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:48.474 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:48.474 "name": "BaseBdev4", 00:18:48.474 "aliases": [ 00:18:48.474 "059ef300-d606-4360-89b5-0e654afcc46b" 00:18:48.474 ], 00:18:48.474 "product_name": "Malloc disk", 00:18:48.474 "block_size": 512, 00:18:48.474 "num_blocks": 65536, 00:18:48.474 "uuid": "059ef300-d606-4360-89b5-0e654afcc46b", 00:18:48.474 "assigned_rate_limits": { 00:18:48.474 "rw_ios_per_sec": 0, 00:18:48.474 "rw_mbytes_per_sec": 0, 00:18:48.474 "r_mbytes_per_sec": 0, 00:18:48.474 "w_mbytes_per_sec": 0 00:18:48.474 }, 00:18:48.474 "claimed": true, 00:18:48.474 "claim_type": "exclusive_write", 00:18:48.474 "zoned": false, 00:18:48.474 "supported_io_types": { 
00:18:48.474 "read": true, 00:18:48.474 "write": true, 00:18:48.474 "unmap": true, 00:18:48.474 "write_zeroes": true, 00:18:48.474 "flush": true, 00:18:48.474 "reset": true, 00:18:48.474 "compare": false, 00:18:48.474 "compare_and_write": false, 00:18:48.474 "abort": true, 00:18:48.474 "nvme_admin": false, 00:18:48.474 "nvme_io": false 00:18:48.474 }, 00:18:48.474 "memory_domains": [ 00:18:48.474 { 00:18:48.474 "dma_device_id": "system", 00:18:48.474 "dma_device_type": 1 00:18:48.474 }, 00:18:48.474 { 00:18:48.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.474 "dma_device_type": 2 00:18:48.474 } 00:18:48.474 ], 00:18:48.474 "driver_specific": {} 00:18:48.474 }' 00:18:48.474 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.474 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.474 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:48.474 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:48.474 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:48.474 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:48.474 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.474 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.474 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:48.474 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:48.735 [2024-06-10 10:14:10.560590] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.735 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:48.996 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:48.996 "name": "Existed_Raid", 00:18:48.996 "uuid": "493f663b-0feb-4385-88e3-220f79966435", 00:18:48.996 "strip_size_kb": 0, 00:18:48.996 "state": "online", 00:18:48.996 "raid_level": "raid1", 00:18:48.996 "superblock": false, 00:18:48.996 "num_base_bdevs": 4, 00:18:48.996 "num_base_bdevs_discovered": 3, 00:18:48.996 "num_base_bdevs_operational": 3, 00:18:48.996 "base_bdevs_list": [ 00:18:48.996 { 00:18:48.996 "name": null, 00:18:48.996 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.996 "is_configured": false, 00:18:48.996 "data_offset": 0, 00:18:48.996 "data_size": 65536 00:18:48.996 }, 00:18:48.996 { 00:18:48.996 "name": "BaseBdev2", 00:18:48.996 "uuid": "e13b22e8-c219-41a7-b9fe-4dce869dc949", 00:18:48.996 "is_configured": true, 00:18:48.996 "data_offset": 0, 00:18:48.996 "data_size": 65536 00:18:48.996 }, 00:18:48.996 { 00:18:48.996 "name": "BaseBdev3", 00:18:48.996 "uuid": "5e7d67e8-2ba5-431c-840a-1c1dc30dc19d", 00:18:48.996 "is_configured": true, 00:18:48.996 "data_offset": 0, 00:18:48.996 "data_size": 65536 00:18:48.996 }, 00:18:48.996 { 00:18:48.996 "name": "BaseBdev4", 00:18:48.996 "uuid": "059ef300-d606-4360-89b5-0e654afcc46b", 00:18:48.996 "is_configured": true, 00:18:48.996 "data_offset": 0, 00:18:48.996 "data_size": 65536 00:18:48.996 } 00:18:48.997 ] 00:18:48.997 }' 00:18:48.997 10:14:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:48.997 10:14:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.566 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:49.566 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:49.566 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.567 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:49.827 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:49.827 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:49.827 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:50.088 [2024-06-10 10:14:11.695472] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:50.088 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:50.088 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i 
< num_base_bdevs )) 00:18:50.088 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.088 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:50.088 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:50.088 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:50.088 10:14:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:50.349 [2024-06-10 10:14:12.082296] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:50.349 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:50.349 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:50.349 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.349 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:50.609 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:50.609 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:50.609 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:50.609 [2024-06-10 10:14:12.469036] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:50.609 [2024-06-10 10:14:12.469092] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:50.609 [2024-06-10 10:14:12.475071] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:50.609 [2024-06-10 10:14:12.475093] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:50.609 [2024-06-10 10:14:12.475100] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa84c0 name Existed_Raid, state offline 00:18:50.870 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:50.870 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:50.870 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.870 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:50.870 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:50.870 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:50.870 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:50.870 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:50.870 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
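The delete/verify cycle traced above repeats one pattern: remove a base bdev, then re-read the raid state and member count. Because raid1 has redundancy, losing a single member leaves the volume online with one fewer discovered bdev; removing the rest takes it offline and the raid bdev is cleaned up. A condensed sketch that reuses the socket and jq filters from the log (expected values taken from the trace, not re-measured):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  $RPC bdev_malloc_delete BaseBdev1
  $RPC bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid") | "\(.state) \(.num_base_bdevs_discovered)"'
  # expected per the trace above: "online 3"

  for b in BaseBdev2 BaseBdev3 BaseBdev4; do $RPC bdev_malloc_delete $b; done
  $RPC bdev_raid_get_bdevs all | jq -r '.[0]["name"] | select(.)'
  # expected: empty output once the raid bdev has been deconfigured and destructed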
00:18:50.870 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:51.130 BaseBdev2 00:18:51.130 10:14:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:51.130 10:14:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:18:51.130 10:14:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:51.130 10:14:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:51.130 10:14:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:51.130 10:14:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:51.130 10:14:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:51.392 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:51.392 [ 00:18:51.392 { 00:18:51.392 "name": "BaseBdev2", 00:18:51.392 "aliases": [ 00:18:51.392 "0adef175-26c0-4cab-9d35-1fe315288a68" 00:18:51.392 ], 00:18:51.392 "product_name": "Malloc disk", 00:18:51.392 "block_size": 512, 00:18:51.392 "num_blocks": 65536, 00:18:51.392 "uuid": "0adef175-26c0-4cab-9d35-1fe315288a68", 00:18:51.392 "assigned_rate_limits": { 00:18:51.392 "rw_ios_per_sec": 0, 00:18:51.392 "rw_mbytes_per_sec": 0, 00:18:51.392 "r_mbytes_per_sec": 0, 00:18:51.392 "w_mbytes_per_sec": 0 00:18:51.392 }, 00:18:51.392 "claimed": false, 00:18:51.392 "zoned": false, 00:18:51.392 "supported_io_types": { 00:18:51.392 "read": true, 00:18:51.392 "write": true, 00:18:51.392 "unmap": true, 00:18:51.392 "write_zeroes": true, 00:18:51.392 "flush": true, 00:18:51.392 "reset": true, 00:18:51.392 "compare": false, 00:18:51.392 "compare_and_write": false, 00:18:51.392 "abort": true, 00:18:51.392 "nvme_admin": false, 00:18:51.392 "nvme_io": false 00:18:51.392 }, 00:18:51.392 "memory_domains": [ 00:18:51.392 { 00:18:51.392 "dma_device_id": "system", 00:18:51.392 "dma_device_type": 1 00:18:51.392 }, 00:18:51.392 { 00:18:51.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.392 "dma_device_type": 2 00:18:51.392 } 00:18:51.392 ], 00:18:51.392 "driver_specific": {} 00:18:51.392 } 00:18:51.392 ] 00:18:51.392 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:51.392 10:14:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:51.392 10:14:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:51.392 10:14:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:51.653 BaseBdev3 00:18:51.653 10:14:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:51.653 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:18:51.653 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 
00:18:51.653 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:51.653 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:51.653 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:51.653 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:51.913 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:52.174 [ 00:18:52.174 { 00:18:52.174 "name": "BaseBdev3", 00:18:52.174 "aliases": [ 00:18:52.174 "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82" 00:18:52.174 ], 00:18:52.174 "product_name": "Malloc disk", 00:18:52.174 "block_size": 512, 00:18:52.174 "num_blocks": 65536, 00:18:52.174 "uuid": "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82", 00:18:52.174 "assigned_rate_limits": { 00:18:52.174 "rw_ios_per_sec": 0, 00:18:52.174 "rw_mbytes_per_sec": 0, 00:18:52.174 "r_mbytes_per_sec": 0, 00:18:52.174 "w_mbytes_per_sec": 0 00:18:52.174 }, 00:18:52.174 "claimed": false, 00:18:52.174 "zoned": false, 00:18:52.174 "supported_io_types": { 00:18:52.174 "read": true, 00:18:52.174 "write": true, 00:18:52.174 "unmap": true, 00:18:52.174 "write_zeroes": true, 00:18:52.174 "flush": true, 00:18:52.174 "reset": true, 00:18:52.174 "compare": false, 00:18:52.174 "compare_and_write": false, 00:18:52.174 "abort": true, 00:18:52.174 "nvme_admin": false, 00:18:52.174 "nvme_io": false 00:18:52.174 }, 00:18:52.174 "memory_domains": [ 00:18:52.174 { 00:18:52.174 "dma_device_id": "system", 00:18:52.174 "dma_device_type": 1 00:18:52.174 }, 00:18:52.174 { 00:18:52.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.174 "dma_device_type": 2 00:18:52.174 } 00:18:52.174 ], 00:18:52.174 "driver_specific": {} 00:18:52.174 } 00:18:52.174 ] 00:18:52.174 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:52.174 10:14:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:52.174 10:14:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:52.174 10:14:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:52.174 BaseBdev4 00:18:52.174 10:14:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:52.174 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:18:52.174 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:52.174 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:52.174 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:52.174 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:52.174 10:14:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:52.434 10:14:14 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:52.694 [ 00:18:52.694 { 00:18:52.694 "name": "BaseBdev4", 00:18:52.694 "aliases": [ 00:18:52.694 "5002d29f-1daa-47fb-ac72-3d8727938256" 00:18:52.694 ], 00:18:52.694 "product_name": "Malloc disk", 00:18:52.694 "block_size": 512, 00:18:52.694 "num_blocks": 65536, 00:18:52.694 "uuid": "5002d29f-1daa-47fb-ac72-3d8727938256", 00:18:52.694 "assigned_rate_limits": { 00:18:52.694 "rw_ios_per_sec": 0, 00:18:52.694 "rw_mbytes_per_sec": 0, 00:18:52.694 "r_mbytes_per_sec": 0, 00:18:52.694 "w_mbytes_per_sec": 0 00:18:52.694 }, 00:18:52.694 "claimed": false, 00:18:52.694 "zoned": false, 00:18:52.694 "supported_io_types": { 00:18:52.694 "read": true, 00:18:52.694 "write": true, 00:18:52.694 "unmap": true, 00:18:52.694 "write_zeroes": true, 00:18:52.694 "flush": true, 00:18:52.694 "reset": true, 00:18:52.694 "compare": false, 00:18:52.694 "compare_and_write": false, 00:18:52.694 "abort": true, 00:18:52.694 "nvme_admin": false, 00:18:52.694 "nvme_io": false 00:18:52.694 }, 00:18:52.694 "memory_domains": [ 00:18:52.694 { 00:18:52.694 "dma_device_id": "system", 00:18:52.694 "dma_device_type": 1 00:18:52.694 }, 00:18:52.694 { 00:18:52.694 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.694 "dma_device_type": 2 00:18:52.694 } 00:18:52.694 ], 00:18:52.694 "driver_specific": {} 00:18:52.694 } 00:18:52.694 ] 00:18:52.694 10:14:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:52.694 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:52.694 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:52.694 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:52.694 [2024-06-10 10:14:14.504047] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:52.694 [2024-06-10 10:14:14.504073] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:52.694 [2024-06-10 10:14:14.504090] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:52.694 [2024-06-10 10:14:14.505099] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:52.694 [2024-06-10 10:14:14.505129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:52.694 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:52.694 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:52.694 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:52.694 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:52.695 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:52.695 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:52.695 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:52.695 10:14:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:52.695 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:52.695 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:52.695 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.695 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:52.955 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:52.955 "name": "Existed_Raid", 00:18:52.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.955 "strip_size_kb": 0, 00:18:52.955 "state": "configuring", 00:18:52.955 "raid_level": "raid1", 00:18:52.955 "superblock": false, 00:18:52.955 "num_base_bdevs": 4, 00:18:52.955 "num_base_bdevs_discovered": 3, 00:18:52.955 "num_base_bdevs_operational": 4, 00:18:52.955 "base_bdevs_list": [ 00:18:52.955 { 00:18:52.955 "name": "BaseBdev1", 00:18:52.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.955 "is_configured": false, 00:18:52.955 "data_offset": 0, 00:18:52.955 "data_size": 0 00:18:52.955 }, 00:18:52.955 { 00:18:52.955 "name": "BaseBdev2", 00:18:52.955 "uuid": "0adef175-26c0-4cab-9d35-1fe315288a68", 00:18:52.955 "is_configured": true, 00:18:52.955 "data_offset": 0, 00:18:52.955 "data_size": 65536 00:18:52.955 }, 00:18:52.955 { 00:18:52.955 "name": "BaseBdev3", 00:18:52.955 "uuid": "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82", 00:18:52.955 "is_configured": true, 00:18:52.955 "data_offset": 0, 00:18:52.955 "data_size": 65536 00:18:52.955 }, 00:18:52.955 { 00:18:52.955 "name": "BaseBdev4", 00:18:52.955 "uuid": "5002d29f-1daa-47fb-ac72-3d8727938256", 00:18:52.955 "is_configured": true, 00:18:52.955 "data_offset": 0, 00:18:52.955 "data_size": 65536 00:18:52.955 } 00:18:52.955 ] 00:18:52.955 }' 00:18:52.955 10:14:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:52.955 10:14:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:53.527 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:53.527 [2024-06-10 10:14:15.378237] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:53.787 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:53.787 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:53.787 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:53.787 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:53.787 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:53.787 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:53.787 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:53.788 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:53.788 10:14:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:53.788 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:53.788 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.788 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:53.788 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.788 "name": "Existed_Raid", 00:18:53.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.788 "strip_size_kb": 0, 00:18:53.788 "state": "configuring", 00:18:53.788 "raid_level": "raid1", 00:18:53.788 "superblock": false, 00:18:53.788 "num_base_bdevs": 4, 00:18:53.788 "num_base_bdevs_discovered": 2, 00:18:53.788 "num_base_bdevs_operational": 4, 00:18:53.788 "base_bdevs_list": [ 00:18:53.788 { 00:18:53.788 "name": "BaseBdev1", 00:18:53.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.788 "is_configured": false, 00:18:53.788 "data_offset": 0, 00:18:53.788 "data_size": 0 00:18:53.788 }, 00:18:53.788 { 00:18:53.788 "name": null, 00:18:53.788 "uuid": "0adef175-26c0-4cab-9d35-1fe315288a68", 00:18:53.788 "is_configured": false, 00:18:53.788 "data_offset": 0, 00:18:53.788 "data_size": 65536 00:18:53.788 }, 00:18:53.788 { 00:18:53.788 "name": "BaseBdev3", 00:18:53.788 "uuid": "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82", 00:18:53.788 "is_configured": true, 00:18:53.788 "data_offset": 0, 00:18:53.788 "data_size": 65536 00:18:53.788 }, 00:18:53.788 { 00:18:53.788 "name": "BaseBdev4", 00:18:53.788 "uuid": "5002d29f-1daa-47fb-ac72-3d8727938256", 00:18:53.788 "is_configured": true, 00:18:53.788 "data_offset": 0, 00:18:53.788 "data_size": 65536 00:18:53.788 } 00:18:53.788 ] 00:18:53.788 }' 00:18:53.788 10:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.788 10:14:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:54.358 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.358 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:54.618 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:54.618 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:54.878 [2024-06-10 10:14:16.501793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:54.878 BaseBdev1 00:18:54.878 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:54.878 10:14:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:18:54.878 10:14:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:54.878 10:14:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:54.878 10:14:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:54.878 10:14:16 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:54.878 10:14:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:54.878 10:14:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:55.139 [ 00:18:55.139 { 00:18:55.139 "name": "BaseBdev1", 00:18:55.139 "aliases": [ 00:18:55.139 "01de2f14-50b5-4245-839d-fbf690dd2ce4" 00:18:55.139 ], 00:18:55.139 "product_name": "Malloc disk", 00:18:55.139 "block_size": 512, 00:18:55.139 "num_blocks": 65536, 00:18:55.139 "uuid": "01de2f14-50b5-4245-839d-fbf690dd2ce4", 00:18:55.139 "assigned_rate_limits": { 00:18:55.139 "rw_ios_per_sec": 0, 00:18:55.139 "rw_mbytes_per_sec": 0, 00:18:55.139 "r_mbytes_per_sec": 0, 00:18:55.139 "w_mbytes_per_sec": 0 00:18:55.139 }, 00:18:55.139 "claimed": true, 00:18:55.139 "claim_type": "exclusive_write", 00:18:55.139 "zoned": false, 00:18:55.139 "supported_io_types": { 00:18:55.139 "read": true, 00:18:55.139 "write": true, 00:18:55.139 "unmap": true, 00:18:55.139 "write_zeroes": true, 00:18:55.139 "flush": true, 00:18:55.139 "reset": true, 00:18:55.139 "compare": false, 00:18:55.139 "compare_and_write": false, 00:18:55.139 "abort": true, 00:18:55.139 "nvme_admin": false, 00:18:55.139 "nvme_io": false 00:18:55.139 }, 00:18:55.139 "memory_domains": [ 00:18:55.139 { 00:18:55.139 "dma_device_id": "system", 00:18:55.139 "dma_device_type": 1 00:18:55.139 }, 00:18:55.139 { 00:18:55.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.139 "dma_device_type": 2 00:18:55.139 } 00:18:55.139 ], 00:18:55.139 "driver_specific": {} 00:18:55.139 } 00:18:55.139 ] 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.139 10:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:55.399 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:18:55.399 "name": "Existed_Raid", 00:18:55.399 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:55.399 "strip_size_kb": 0, 00:18:55.399 "state": "configuring", 00:18:55.399 "raid_level": "raid1", 00:18:55.399 "superblock": false, 00:18:55.399 "num_base_bdevs": 4, 00:18:55.399 "num_base_bdevs_discovered": 3, 00:18:55.399 "num_base_bdevs_operational": 4, 00:18:55.399 "base_bdevs_list": [ 00:18:55.399 { 00:18:55.399 "name": "BaseBdev1", 00:18:55.399 "uuid": "01de2f14-50b5-4245-839d-fbf690dd2ce4", 00:18:55.399 "is_configured": true, 00:18:55.399 "data_offset": 0, 00:18:55.399 "data_size": 65536 00:18:55.399 }, 00:18:55.399 { 00:18:55.399 "name": null, 00:18:55.399 "uuid": "0adef175-26c0-4cab-9d35-1fe315288a68", 00:18:55.399 "is_configured": false, 00:18:55.399 "data_offset": 0, 00:18:55.399 "data_size": 65536 00:18:55.399 }, 00:18:55.399 { 00:18:55.399 "name": "BaseBdev3", 00:18:55.399 "uuid": "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82", 00:18:55.399 "is_configured": true, 00:18:55.399 "data_offset": 0, 00:18:55.399 "data_size": 65536 00:18:55.399 }, 00:18:55.399 { 00:18:55.399 "name": "BaseBdev4", 00:18:55.399 "uuid": "5002d29f-1daa-47fb-ac72-3d8727938256", 00:18:55.399 "is_configured": true, 00:18:55.399 "data_offset": 0, 00:18:55.399 "data_size": 65536 00:18:55.399 } 00:18:55.399 ] 00:18:55.399 }' 00:18:55.399 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:55.399 10:14:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:55.970 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.970 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:55.970 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:55.970 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:56.230 [2024-06-10 10:14:17.921404] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.230 10:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:56.490 10:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.490 "name": "Existed_Raid", 00:18:56.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.490 "strip_size_kb": 0, 00:18:56.490 "state": "configuring", 00:18:56.490 "raid_level": "raid1", 00:18:56.490 "superblock": false, 00:18:56.490 "num_base_bdevs": 4, 00:18:56.490 "num_base_bdevs_discovered": 2, 00:18:56.490 "num_base_bdevs_operational": 4, 00:18:56.490 "base_bdevs_list": [ 00:18:56.490 { 00:18:56.490 "name": "BaseBdev1", 00:18:56.490 "uuid": "01de2f14-50b5-4245-839d-fbf690dd2ce4", 00:18:56.490 "is_configured": true, 00:18:56.490 "data_offset": 0, 00:18:56.490 "data_size": 65536 00:18:56.490 }, 00:18:56.490 { 00:18:56.490 "name": null, 00:18:56.490 "uuid": "0adef175-26c0-4cab-9d35-1fe315288a68", 00:18:56.490 "is_configured": false, 00:18:56.490 "data_offset": 0, 00:18:56.490 "data_size": 65536 00:18:56.490 }, 00:18:56.490 { 00:18:56.490 "name": null, 00:18:56.490 "uuid": "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82", 00:18:56.490 "is_configured": false, 00:18:56.490 "data_offset": 0, 00:18:56.490 "data_size": 65536 00:18:56.490 }, 00:18:56.490 { 00:18:56.490 "name": "BaseBdev4", 00:18:56.490 "uuid": "5002d29f-1daa-47fb-ac72-3d8727938256", 00:18:56.490 "is_configured": true, 00:18:56.490 "data_offset": 0, 00:18:56.490 "data_size": 65536 00:18:56.490 } 00:18:56.490 ] 00:18:56.490 }' 00:18:56.490 10:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.490 10:14:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:57.060 10:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:57.060 10:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.060 10:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:57.060 10:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:57.320 [2024-06-10 10:14:19.028220] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.320 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:57.581 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:57.581 "name": "Existed_Raid", 00:18:57.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:57.581 "strip_size_kb": 0, 00:18:57.581 "state": "configuring", 00:18:57.581 "raid_level": "raid1", 00:18:57.581 "superblock": false, 00:18:57.581 "num_base_bdevs": 4, 00:18:57.581 "num_base_bdevs_discovered": 3, 00:18:57.581 "num_base_bdevs_operational": 4, 00:18:57.581 "base_bdevs_list": [ 00:18:57.581 { 00:18:57.581 "name": "BaseBdev1", 00:18:57.581 "uuid": "01de2f14-50b5-4245-839d-fbf690dd2ce4", 00:18:57.581 "is_configured": true, 00:18:57.581 "data_offset": 0, 00:18:57.581 "data_size": 65536 00:18:57.581 }, 00:18:57.581 { 00:18:57.581 "name": null, 00:18:57.581 "uuid": "0adef175-26c0-4cab-9d35-1fe315288a68", 00:18:57.581 "is_configured": false, 00:18:57.581 "data_offset": 0, 00:18:57.581 "data_size": 65536 00:18:57.581 }, 00:18:57.581 { 00:18:57.581 "name": "BaseBdev3", 00:18:57.581 "uuid": "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82", 00:18:57.581 "is_configured": true, 00:18:57.581 "data_offset": 0, 00:18:57.581 "data_size": 65536 00:18:57.581 }, 00:18:57.581 { 00:18:57.581 "name": "BaseBdev4", 00:18:57.581 "uuid": "5002d29f-1daa-47fb-ac72-3d8727938256", 00:18:57.581 "is_configured": true, 00:18:57.581 "data_offset": 0, 00:18:57.581 "data_size": 65536 00:18:57.581 } 00:18:57.581 ] 00:18:57.581 }' 00:18:57.581 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:57.581 10:14:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:58.151 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.151 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:58.151 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:58.151 10:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:58.411 [2024-06-10 10:14:20.151180] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.411 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:58.671 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:58.671 "name": "Existed_Raid", 00:18:58.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:58.671 "strip_size_kb": 0, 00:18:58.671 "state": "configuring", 00:18:58.671 "raid_level": "raid1", 00:18:58.671 "superblock": false, 00:18:58.671 "num_base_bdevs": 4, 00:18:58.671 "num_base_bdevs_discovered": 2, 00:18:58.671 "num_base_bdevs_operational": 4, 00:18:58.671 "base_bdevs_list": [ 00:18:58.671 { 00:18:58.671 "name": null, 00:18:58.671 "uuid": "01de2f14-50b5-4245-839d-fbf690dd2ce4", 00:18:58.671 "is_configured": false, 00:18:58.671 "data_offset": 0, 00:18:58.671 "data_size": 65536 00:18:58.671 }, 00:18:58.671 { 00:18:58.671 "name": null, 00:18:58.671 "uuid": "0adef175-26c0-4cab-9d35-1fe315288a68", 00:18:58.671 "is_configured": false, 00:18:58.671 "data_offset": 0, 00:18:58.671 "data_size": 65536 00:18:58.671 }, 00:18:58.671 { 00:18:58.671 "name": "BaseBdev3", 00:18:58.671 "uuid": "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82", 00:18:58.671 "is_configured": true, 00:18:58.671 "data_offset": 0, 00:18:58.671 "data_size": 65536 00:18:58.671 }, 00:18:58.671 { 00:18:58.671 "name": "BaseBdev4", 00:18:58.671 "uuid": "5002d29f-1daa-47fb-ac72-3d8727938256", 00:18:58.671 "is_configured": true, 00:18:58.671 "data_offset": 0, 00:18:58.671 "data_size": 65536 00:18:58.671 } 00:18:58.671 ] 00:18:58.671 }' 00:18:58.671 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:58.671 10:14:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:59.242 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.242 10:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:59.242 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:59.242 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:59.502 [2024-06-10 10:14:21.211479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:59.502 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.762 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:59.762 "name": "Existed_Raid", 00:18:59.762 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.762 "strip_size_kb": 0, 00:18:59.762 "state": "configuring", 00:18:59.762 "raid_level": "raid1", 00:18:59.762 "superblock": false, 00:18:59.762 "num_base_bdevs": 4, 00:18:59.762 "num_base_bdevs_discovered": 3, 00:18:59.762 "num_base_bdevs_operational": 4, 00:18:59.762 "base_bdevs_list": [ 00:18:59.762 { 00:18:59.762 "name": null, 00:18:59.762 "uuid": "01de2f14-50b5-4245-839d-fbf690dd2ce4", 00:18:59.762 "is_configured": false, 00:18:59.762 "data_offset": 0, 00:18:59.762 "data_size": 65536 00:18:59.762 }, 00:18:59.762 { 00:18:59.762 "name": "BaseBdev2", 00:18:59.762 "uuid": "0adef175-26c0-4cab-9d35-1fe315288a68", 00:18:59.762 "is_configured": true, 00:18:59.762 "data_offset": 0, 00:18:59.762 "data_size": 65536 00:18:59.762 }, 00:18:59.762 { 00:18:59.762 "name": "BaseBdev3", 00:18:59.762 "uuid": "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82", 00:18:59.762 "is_configured": true, 00:18:59.762 "data_offset": 0, 00:18:59.762 "data_size": 65536 00:18:59.762 }, 00:18:59.762 { 00:18:59.762 "name": "BaseBdev4", 00:18:59.762 "uuid": "5002d29f-1daa-47fb-ac72-3d8727938256", 00:18:59.762 "is_configured": true, 00:18:59.762 "data_offset": 0, 00:18:59.762 "data_size": 65536 00:18:59.762 } 00:18:59.762 ] 00:18:59.762 }' 00:18:59.762 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.762 10:14:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:00.331 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.331 10:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:00.331 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:00.331 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:19:00.331 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:00.590 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 01de2f14-50b5-4245-839d-fbf690dd2ce4 00:19:00.851 [2024-06-10 10:14:22.475567] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:00.851 [2024-06-10 10:14:22.475593] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xaa4f30 00:19:00.851 [2024-06-10 10:14:22.475597] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:00.851 [2024-06-10 10:14:22.475743] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7a5240 00:19:00.851 [2024-06-10 10:14:22.475850] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaa4f30 00:19:00.851 [2024-06-10 10:14:22.475856] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xaa4f30 00:19:00.851 [2024-06-10 10:14:22.475977] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:00.851 NewBaseBdev 00:19:00.851 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:00.851 10:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:19:00.851 10:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:00.851 10:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:19:00.851 10:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:00.851 10:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:00.851 10:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:00.851 10:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:01.112 [ 00:19:01.112 { 00:19:01.112 "name": "NewBaseBdev", 00:19:01.112 "aliases": [ 00:19:01.112 "01de2f14-50b5-4245-839d-fbf690dd2ce4" 00:19:01.112 ], 00:19:01.112 "product_name": "Malloc disk", 00:19:01.112 "block_size": 512, 00:19:01.112 "num_blocks": 65536, 00:19:01.112 "uuid": "01de2f14-50b5-4245-839d-fbf690dd2ce4", 00:19:01.112 "assigned_rate_limits": { 00:19:01.112 "rw_ios_per_sec": 0, 00:19:01.112 "rw_mbytes_per_sec": 0, 00:19:01.112 "r_mbytes_per_sec": 0, 00:19:01.112 "w_mbytes_per_sec": 0 00:19:01.112 }, 00:19:01.112 "claimed": true, 00:19:01.112 "claim_type": "exclusive_write", 00:19:01.112 "zoned": false, 00:19:01.112 "supported_io_types": { 00:19:01.112 "read": true, 00:19:01.112 "write": true, 00:19:01.112 "unmap": true, 00:19:01.112 "write_zeroes": true, 00:19:01.112 "flush": true, 00:19:01.112 "reset": true, 00:19:01.112 "compare": false, 00:19:01.112 "compare_and_write": false, 00:19:01.112 "abort": true, 00:19:01.112 "nvme_admin": false, 00:19:01.112 "nvme_io": false 00:19:01.112 }, 00:19:01.112 "memory_domains": [ 00:19:01.112 { 00:19:01.112 "dma_device_id": "system", 00:19:01.112 "dma_device_type": 1 00:19:01.112 
}, 00:19:01.112 { 00:19:01.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.112 "dma_device_type": 2 00:19:01.112 } 00:19:01.112 ], 00:19:01.112 "driver_specific": {} 00:19:01.112 } 00:19:01.112 ] 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.112 10:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:01.372 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.372 "name": "Existed_Raid", 00:19:01.372 "uuid": "548f6feb-d76f-4e76-89e6-619d0f46976a", 00:19:01.372 "strip_size_kb": 0, 00:19:01.372 "state": "online", 00:19:01.372 "raid_level": "raid1", 00:19:01.372 "superblock": false, 00:19:01.372 "num_base_bdevs": 4, 00:19:01.372 "num_base_bdevs_discovered": 4, 00:19:01.372 "num_base_bdevs_operational": 4, 00:19:01.372 "base_bdevs_list": [ 00:19:01.372 { 00:19:01.372 "name": "NewBaseBdev", 00:19:01.372 "uuid": "01de2f14-50b5-4245-839d-fbf690dd2ce4", 00:19:01.372 "is_configured": true, 00:19:01.372 "data_offset": 0, 00:19:01.372 "data_size": 65536 00:19:01.372 }, 00:19:01.372 { 00:19:01.372 "name": "BaseBdev2", 00:19:01.372 "uuid": "0adef175-26c0-4cab-9d35-1fe315288a68", 00:19:01.372 "is_configured": true, 00:19:01.372 "data_offset": 0, 00:19:01.372 "data_size": 65536 00:19:01.372 }, 00:19:01.372 { 00:19:01.372 "name": "BaseBdev3", 00:19:01.372 "uuid": "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82", 00:19:01.372 "is_configured": true, 00:19:01.372 "data_offset": 0, 00:19:01.372 "data_size": 65536 00:19:01.372 }, 00:19:01.372 { 00:19:01.372 "name": "BaseBdev4", 00:19:01.372 "uuid": "5002d29f-1daa-47fb-ac72-3d8727938256", 00:19:01.372 "is_configured": true, 00:19:01.372 "data_offset": 0, 00:19:01.372 "data_size": 65536 00:19:01.372 } 00:19:01.372 ] 00:19:01.372 }' 00:19:01.372 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.372 10:14:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:01.632 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 
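verify_raid_bdev_properties, traced from here on (bdev_raid.sh@194-@208 below), pulls the raid bdev's JSON, extracts the configured base bdev names, and compares block_size, md_size, md_interleave and dif_type between the raid volume and each base bdev. A minimal bash sketch of those checks, reusing the rpc.py socket and the jq filters visible in the trace; the loop structure and variable names are illustrative rather than the upstream script verbatim:

    #!/usr/bin/env bash
    # Sketch of the traced property checks: every configured base bdev must match
    # the raid volume's block_size and metadata-related fields.
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    raid_info=$($rpc bdev_get_bdevs -b Existed_Raid | jq '.[]')
    base_bdev_names=$(echo "$raid_info" | jq -r \
        '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')
    for name in $base_bdev_names; do
        base_info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
        for field in .block_size .md_size .md_interleave .dif_type; do
            [ "$(echo "$raid_info" | jq "$field")" = "$(echo "$base_info" | jq "$field")" ] || exit 1
        done
    done

In this run each comparison reduces to "512 == 512" for block_size and "null == null" for the metadata fields, which is exactly what the [[ ... ]] lines in the trace below show for NewBaseBdev and BaseBdev2 through BaseBdev4.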
00:19:01.632 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:01.632 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:01.632 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:01.632 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:01.632 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:01.632 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:01.632 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:01.892 [2024-06-10 10:14:23.670839] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:01.892 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:01.892 "name": "Existed_Raid", 00:19:01.892 "aliases": [ 00:19:01.892 "548f6feb-d76f-4e76-89e6-619d0f46976a" 00:19:01.892 ], 00:19:01.892 "product_name": "Raid Volume", 00:19:01.892 "block_size": 512, 00:19:01.892 "num_blocks": 65536, 00:19:01.892 "uuid": "548f6feb-d76f-4e76-89e6-619d0f46976a", 00:19:01.892 "assigned_rate_limits": { 00:19:01.892 "rw_ios_per_sec": 0, 00:19:01.892 "rw_mbytes_per_sec": 0, 00:19:01.892 "r_mbytes_per_sec": 0, 00:19:01.892 "w_mbytes_per_sec": 0 00:19:01.892 }, 00:19:01.892 "claimed": false, 00:19:01.892 "zoned": false, 00:19:01.892 "supported_io_types": { 00:19:01.892 "read": true, 00:19:01.892 "write": true, 00:19:01.892 "unmap": false, 00:19:01.892 "write_zeroes": true, 00:19:01.892 "flush": false, 00:19:01.892 "reset": true, 00:19:01.892 "compare": false, 00:19:01.892 "compare_and_write": false, 00:19:01.892 "abort": false, 00:19:01.892 "nvme_admin": false, 00:19:01.892 "nvme_io": false 00:19:01.892 }, 00:19:01.892 "memory_domains": [ 00:19:01.892 { 00:19:01.892 "dma_device_id": "system", 00:19:01.892 "dma_device_type": 1 00:19:01.892 }, 00:19:01.892 { 00:19:01.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.892 "dma_device_type": 2 00:19:01.892 }, 00:19:01.892 { 00:19:01.892 "dma_device_id": "system", 00:19:01.892 "dma_device_type": 1 00:19:01.892 }, 00:19:01.892 { 00:19:01.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.892 "dma_device_type": 2 00:19:01.892 }, 00:19:01.892 { 00:19:01.892 "dma_device_id": "system", 00:19:01.892 "dma_device_type": 1 00:19:01.892 }, 00:19:01.892 { 00:19:01.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.892 "dma_device_type": 2 00:19:01.892 }, 00:19:01.892 { 00:19:01.892 "dma_device_id": "system", 00:19:01.892 "dma_device_type": 1 00:19:01.892 }, 00:19:01.892 { 00:19:01.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.892 "dma_device_type": 2 00:19:01.892 } 00:19:01.892 ], 00:19:01.892 "driver_specific": { 00:19:01.892 "raid": { 00:19:01.892 "uuid": "548f6feb-d76f-4e76-89e6-619d0f46976a", 00:19:01.892 "strip_size_kb": 0, 00:19:01.892 "state": "online", 00:19:01.892 "raid_level": "raid1", 00:19:01.892 "superblock": false, 00:19:01.892 "num_base_bdevs": 4, 00:19:01.892 "num_base_bdevs_discovered": 4, 00:19:01.892 "num_base_bdevs_operational": 4, 00:19:01.892 "base_bdevs_list": [ 00:19:01.892 { 00:19:01.892 "name": "NewBaseBdev", 00:19:01.892 "uuid": "01de2f14-50b5-4245-839d-fbf690dd2ce4", 00:19:01.892 "is_configured": true, 00:19:01.892 
"data_offset": 0, 00:19:01.892 "data_size": 65536 00:19:01.892 }, 00:19:01.892 { 00:19:01.892 "name": "BaseBdev2", 00:19:01.892 "uuid": "0adef175-26c0-4cab-9d35-1fe315288a68", 00:19:01.893 "is_configured": true, 00:19:01.893 "data_offset": 0, 00:19:01.893 "data_size": 65536 00:19:01.893 }, 00:19:01.893 { 00:19:01.893 "name": "BaseBdev3", 00:19:01.893 "uuid": "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82", 00:19:01.893 "is_configured": true, 00:19:01.893 "data_offset": 0, 00:19:01.893 "data_size": 65536 00:19:01.893 }, 00:19:01.893 { 00:19:01.893 "name": "BaseBdev4", 00:19:01.893 "uuid": "5002d29f-1daa-47fb-ac72-3d8727938256", 00:19:01.893 "is_configured": true, 00:19:01.893 "data_offset": 0, 00:19:01.893 "data_size": 65536 00:19:01.893 } 00:19:01.893 ] 00:19:01.893 } 00:19:01.893 } 00:19:01.893 }' 00:19:01.893 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:01.893 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:01.893 BaseBdev2 00:19:01.893 BaseBdev3 00:19:01.893 BaseBdev4' 00:19:01.893 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:01.893 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:01.893 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:02.152 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:02.152 "name": "NewBaseBdev", 00:19:02.152 "aliases": [ 00:19:02.152 "01de2f14-50b5-4245-839d-fbf690dd2ce4" 00:19:02.153 ], 00:19:02.153 "product_name": "Malloc disk", 00:19:02.153 "block_size": 512, 00:19:02.153 "num_blocks": 65536, 00:19:02.153 "uuid": "01de2f14-50b5-4245-839d-fbf690dd2ce4", 00:19:02.153 "assigned_rate_limits": { 00:19:02.153 "rw_ios_per_sec": 0, 00:19:02.153 "rw_mbytes_per_sec": 0, 00:19:02.153 "r_mbytes_per_sec": 0, 00:19:02.153 "w_mbytes_per_sec": 0 00:19:02.153 }, 00:19:02.153 "claimed": true, 00:19:02.153 "claim_type": "exclusive_write", 00:19:02.153 "zoned": false, 00:19:02.153 "supported_io_types": { 00:19:02.153 "read": true, 00:19:02.153 "write": true, 00:19:02.153 "unmap": true, 00:19:02.153 "write_zeroes": true, 00:19:02.153 "flush": true, 00:19:02.153 "reset": true, 00:19:02.153 "compare": false, 00:19:02.153 "compare_and_write": false, 00:19:02.153 "abort": true, 00:19:02.153 "nvme_admin": false, 00:19:02.153 "nvme_io": false 00:19:02.153 }, 00:19:02.153 "memory_domains": [ 00:19:02.153 { 00:19:02.153 "dma_device_id": "system", 00:19:02.153 "dma_device_type": 1 00:19:02.153 }, 00:19:02.153 { 00:19:02.153 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.153 "dma_device_type": 2 00:19:02.153 } 00:19:02.153 ], 00:19:02.153 "driver_specific": {} 00:19:02.153 }' 00:19:02.153 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.153 10:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.412 10:14:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:02.412 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:02.672 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:02.672 "name": "BaseBdev2", 00:19:02.672 "aliases": [ 00:19:02.672 "0adef175-26c0-4cab-9d35-1fe315288a68" 00:19:02.672 ], 00:19:02.672 "product_name": "Malloc disk", 00:19:02.672 "block_size": 512, 00:19:02.672 "num_blocks": 65536, 00:19:02.672 "uuid": "0adef175-26c0-4cab-9d35-1fe315288a68", 00:19:02.672 "assigned_rate_limits": { 00:19:02.672 "rw_ios_per_sec": 0, 00:19:02.672 "rw_mbytes_per_sec": 0, 00:19:02.672 "r_mbytes_per_sec": 0, 00:19:02.672 "w_mbytes_per_sec": 0 00:19:02.672 }, 00:19:02.672 "claimed": true, 00:19:02.673 "claim_type": "exclusive_write", 00:19:02.673 "zoned": false, 00:19:02.673 "supported_io_types": { 00:19:02.673 "read": true, 00:19:02.673 "write": true, 00:19:02.673 "unmap": true, 00:19:02.673 "write_zeroes": true, 00:19:02.673 "flush": true, 00:19:02.673 "reset": true, 00:19:02.673 "compare": false, 00:19:02.673 "compare_and_write": false, 00:19:02.673 "abort": true, 00:19:02.673 "nvme_admin": false, 00:19:02.673 "nvme_io": false 00:19:02.673 }, 00:19:02.673 "memory_domains": [ 00:19:02.673 { 00:19:02.673 "dma_device_id": "system", 00:19:02.673 "dma_device_type": 1 00:19:02.673 }, 00:19:02.673 { 00:19:02.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.673 "dma_device_type": 2 00:19:02.673 } 00:19:02.673 ], 00:19:02.673 "driver_specific": {} 00:19:02.673 }' 00:19:02.673 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.673 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.933 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:02.933 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.933 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.933 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:02.933 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.933 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.933 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:02.933 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 
-- # jq .dif_type 00:19:02.933 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.242 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:03.242 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:03.242 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:03.242 10:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:03.242 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:03.242 "name": "BaseBdev3", 00:19:03.242 "aliases": [ 00:19:03.242 "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82" 00:19:03.242 ], 00:19:03.242 "product_name": "Malloc disk", 00:19:03.242 "block_size": 512, 00:19:03.242 "num_blocks": 65536, 00:19:03.242 "uuid": "e3dc54f0-fbd5-46cf-a0a5-5f7d04a52c82", 00:19:03.242 "assigned_rate_limits": { 00:19:03.242 "rw_ios_per_sec": 0, 00:19:03.242 "rw_mbytes_per_sec": 0, 00:19:03.242 "r_mbytes_per_sec": 0, 00:19:03.242 "w_mbytes_per_sec": 0 00:19:03.242 }, 00:19:03.242 "claimed": true, 00:19:03.242 "claim_type": "exclusive_write", 00:19:03.242 "zoned": false, 00:19:03.242 "supported_io_types": { 00:19:03.242 "read": true, 00:19:03.242 "write": true, 00:19:03.242 "unmap": true, 00:19:03.242 "write_zeroes": true, 00:19:03.242 "flush": true, 00:19:03.242 "reset": true, 00:19:03.242 "compare": false, 00:19:03.242 "compare_and_write": false, 00:19:03.242 "abort": true, 00:19:03.242 "nvme_admin": false, 00:19:03.242 "nvme_io": false 00:19:03.242 }, 00:19:03.242 "memory_domains": [ 00:19:03.242 { 00:19:03.242 "dma_device_id": "system", 00:19:03.242 "dma_device_type": 1 00:19:03.242 }, 00:19:03.242 { 00:19:03.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:03.242 "dma_device_type": 2 00:19:03.242 } 00:19:03.242 ], 00:19:03.242 "driver_specific": {} 00:19:03.242 }' 00:19:03.242 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.242 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:03.524 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:03.783 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:03.783 "name": "BaseBdev4", 00:19:03.783 "aliases": [ 00:19:03.783 "5002d29f-1daa-47fb-ac72-3d8727938256" 00:19:03.783 ], 00:19:03.783 "product_name": "Malloc disk", 00:19:03.783 "block_size": 512, 00:19:03.783 "num_blocks": 65536, 00:19:03.783 "uuid": "5002d29f-1daa-47fb-ac72-3d8727938256", 00:19:03.783 "assigned_rate_limits": { 00:19:03.783 "rw_ios_per_sec": 0, 00:19:03.783 "rw_mbytes_per_sec": 0, 00:19:03.783 "r_mbytes_per_sec": 0, 00:19:03.783 "w_mbytes_per_sec": 0 00:19:03.783 }, 00:19:03.783 "claimed": true, 00:19:03.783 "claim_type": "exclusive_write", 00:19:03.783 "zoned": false, 00:19:03.783 "supported_io_types": { 00:19:03.783 "read": true, 00:19:03.783 "write": true, 00:19:03.783 "unmap": true, 00:19:03.783 "write_zeroes": true, 00:19:03.783 "flush": true, 00:19:03.783 "reset": true, 00:19:03.783 "compare": false, 00:19:03.783 "compare_and_write": false, 00:19:03.783 "abort": true, 00:19:03.783 "nvme_admin": false, 00:19:03.783 "nvme_io": false 00:19:03.783 }, 00:19:03.783 "memory_domains": [ 00:19:03.783 { 00:19:03.783 "dma_device_id": "system", 00:19:03.783 "dma_device_type": 1 00:19:03.783 }, 00:19:03.783 { 00:19:03.783 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:03.783 "dma_device_type": 2 00:19:03.783 } 00:19:03.783 ], 00:19:03.783 "driver_specific": {} 00:19:03.783 }' 00:19:03.783 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.783 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.783 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:03.783 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:04.043 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:04.043 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:04.043 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:04.043 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:04.043 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:04.043 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:04.043 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:04.043 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:04.043 10:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:04.303 [2024-06-10 10:14:26.068695] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:04.303 [2024-06-10 10:14:26.068712] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:04.303 [2024-06-10 10:14:26.068753] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:04.303 [2024-06-10 10:14:26.068963] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:19:04.303 [2024-06-10 10:14:26.068970] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa4f30 name Existed_Raid, state offline 00:19:04.303 10:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1052428 00:19:04.303 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 1052428 ']' 00:19:04.303 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1052428 00:19:04.303 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:19:04.303 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:04.303 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1052428 00:19:04.303 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:04.303 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:04.303 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1052428' 00:19:04.303 killing process with pid 1052428 00:19:04.303 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1052428 00:19:04.303 [2024-06-10 10:14:26.137105] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:04.303 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1052428 00:19:04.303 [2024-06-10 10:14:26.157176] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:04.564 00:19:04.564 real 0m26.413s 00:19:04.564 user 0m49.571s 00:19:04.564 sys 0m3.817s 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.564 ************************************ 00:19:04.564 END TEST raid_state_function_test 00:19:04.564 ************************************ 00:19:04.564 10:14:26 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:19:04.564 10:14:26 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:19:04.564 10:14:26 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:04.564 10:14:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:04.564 ************************************ 00:19:04.564 START TEST raid_state_function_test_sb 00:19:04.564 ************************************ 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 4 true 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1057447 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1057447' 00:19:04.564 Process raid pid: 1057447 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1057447 /var/tmp/spdk-raid.sock 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1057447 ']' 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:04.564 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:04.564 10:14:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:04.564 [2024-06-10 10:14:26.414999] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:19:04.564 [2024-06-10 10:14:26.415046] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:04.825 [2024-06-10 10:14:26.504279] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:04.825 [2024-06-10 10:14:26.567953] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:04.825 [2024-06-10 10:14:26.607526] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:04.825 [2024-06-10 10:14:26.607547] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:05.765 [2024-06-10 10:14:27.438495] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:05.765 [2024-06-10 10:14:27.438528] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:05.765 [2024-06-10 10:14:27.438534] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:05.765 [2024-06-10 10:14:27.438540] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:05.765 [2024-06-10 10:14:27.438547] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:05.765 [2024-06-10 10:14:27.438552] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:05.765 [2024-06-10 10:14:27.438557] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:05.765 [2024-06-10 10:14:27.438562] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.765 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:06.025 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.025 "name": "Existed_Raid", 00:19:06.025 "uuid": "6e08d563-a2fb-4fa7-8865-3f768ceee229", 00:19:06.025 "strip_size_kb": 0, 00:19:06.025 "state": "configuring", 00:19:06.025 "raid_level": "raid1", 00:19:06.025 "superblock": true, 00:19:06.025 "num_base_bdevs": 4, 00:19:06.025 "num_base_bdevs_discovered": 0, 00:19:06.025 "num_base_bdevs_operational": 4, 00:19:06.025 "base_bdevs_list": [ 00:19:06.025 { 00:19:06.025 "name": "BaseBdev1", 00:19:06.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.025 "is_configured": false, 00:19:06.025 "data_offset": 0, 00:19:06.025 "data_size": 0 00:19:06.025 }, 00:19:06.025 { 00:19:06.025 "name": "BaseBdev2", 00:19:06.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.025 "is_configured": false, 00:19:06.025 "data_offset": 0, 00:19:06.025 "data_size": 0 00:19:06.025 }, 00:19:06.025 { 00:19:06.025 "name": "BaseBdev3", 00:19:06.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.025 "is_configured": false, 00:19:06.025 "data_offset": 0, 00:19:06.025 "data_size": 0 00:19:06.025 }, 00:19:06.025 { 00:19:06.025 "name": "BaseBdev4", 00:19:06.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.025 "is_configured": false, 00:19:06.025 "data_offset": 0, 00:19:06.025 "data_size": 0 00:19:06.025 } 00:19:06.025 ] 00:19:06.025 }' 00:19:06.025 10:14:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.025 10:14:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:06.594 10:14:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:06.594 [2024-06-10 10:14:28.392792] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:06.594 [2024-06-10 10:14:28.392810] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2556b20 name Existed_Raid, state configuring 00:19:06.594 10:14:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:06.854 [2024-06-10 10:14:28.577277] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:06.854 [2024-06-10 10:14:28.577293] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:06.854 [2024-06-10 10:14:28.577298] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
BaseBdev2 00:19:06.854 [2024-06-10 10:14:28.577303] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:06.854 [2024-06-10 10:14:28.577308] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:06.854 [2024-06-10 10:14:28.577313] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:06.854 [2024-06-10 10:14:28.577318] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:06.854 [2024-06-10 10:14:28.577323] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:06.854 10:14:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:07.114 [2024-06-10 10:14:28.776341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:07.114 BaseBdev1 00:19:07.114 10:14:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:07.114 10:14:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:19:07.114 10:14:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:07.114 10:14:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:19:07.114 10:14:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:07.114 10:14:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:07.114 10:14:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:07.374 10:14:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:07.374 [ 00:19:07.374 { 00:19:07.374 "name": "BaseBdev1", 00:19:07.374 "aliases": [ 00:19:07.374 "52bceca8-e9f8-400d-bf44-2613df6713db" 00:19:07.374 ], 00:19:07.374 "product_name": "Malloc disk", 00:19:07.374 "block_size": 512, 00:19:07.374 "num_blocks": 65536, 00:19:07.374 "uuid": "52bceca8-e9f8-400d-bf44-2613df6713db", 00:19:07.374 "assigned_rate_limits": { 00:19:07.374 "rw_ios_per_sec": 0, 00:19:07.374 "rw_mbytes_per_sec": 0, 00:19:07.374 "r_mbytes_per_sec": 0, 00:19:07.374 "w_mbytes_per_sec": 0 00:19:07.374 }, 00:19:07.374 "claimed": true, 00:19:07.374 "claim_type": "exclusive_write", 00:19:07.374 "zoned": false, 00:19:07.374 "supported_io_types": { 00:19:07.374 "read": true, 00:19:07.374 "write": true, 00:19:07.374 "unmap": true, 00:19:07.374 "write_zeroes": true, 00:19:07.374 "flush": true, 00:19:07.374 "reset": true, 00:19:07.374 "compare": false, 00:19:07.374 "compare_and_write": false, 00:19:07.374 "abort": true, 00:19:07.374 "nvme_admin": false, 00:19:07.374 "nvme_io": false 00:19:07.374 }, 00:19:07.374 "memory_domains": [ 00:19:07.374 { 00:19:07.374 "dma_device_id": "system", 00:19:07.374 "dma_device_type": 1 00:19:07.374 }, 00:19:07.374 { 00:19:07.374 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.374 "dma_device_type": 2 00:19:07.374 } 00:19:07.374 ], 00:19:07.374 "driver_specific": {} 00:19:07.374 } 00:19:07.374 ] 00:19:07.374 10:14:29 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:19:07.374 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:07.374 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:07.374 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:07.374 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:07.374 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:07.374 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:07.374 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.374 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.374 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.374 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.375 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.375 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:07.634 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:07.634 "name": "Existed_Raid", 00:19:07.634 "uuid": "65ed226c-f974-4874-b9a7-a87b6b14dd08", 00:19:07.635 "strip_size_kb": 0, 00:19:07.635 "state": "configuring", 00:19:07.635 "raid_level": "raid1", 00:19:07.635 "superblock": true, 00:19:07.635 "num_base_bdevs": 4, 00:19:07.635 "num_base_bdevs_discovered": 1, 00:19:07.635 "num_base_bdevs_operational": 4, 00:19:07.635 "base_bdevs_list": [ 00:19:07.635 { 00:19:07.635 "name": "BaseBdev1", 00:19:07.635 "uuid": "52bceca8-e9f8-400d-bf44-2613df6713db", 00:19:07.635 "is_configured": true, 00:19:07.635 "data_offset": 2048, 00:19:07.635 "data_size": 63488 00:19:07.635 }, 00:19:07.635 { 00:19:07.635 "name": "BaseBdev2", 00:19:07.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.635 "is_configured": false, 00:19:07.635 "data_offset": 0, 00:19:07.635 "data_size": 0 00:19:07.635 }, 00:19:07.635 { 00:19:07.635 "name": "BaseBdev3", 00:19:07.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.635 "is_configured": false, 00:19:07.635 "data_offset": 0, 00:19:07.635 "data_size": 0 00:19:07.635 }, 00:19:07.635 { 00:19:07.635 "name": "BaseBdev4", 00:19:07.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.635 "is_configured": false, 00:19:07.635 "data_offset": 0, 00:19:07.635 "data_size": 0 00:19:07.635 } 00:19:07.635 ] 00:19:07.635 }' 00:19:07.635 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:07.635 10:14:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:08.251 10:14:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:08.251 [2024-06-10 10:14:30.063587] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete 
raid bdev: Existed_Raid 00:19:08.251 [2024-06-10 10:14:30.063619] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25563b0 name Existed_Raid, state configuring 00:19:08.251 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:08.511 [2024-06-10 10:14:30.252097] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:08.511 [2024-06-10 10:14:30.253246] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:08.511 [2024-06-10 10:14:30.253270] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:08.511 [2024-06-10 10:14:30.253276] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:08.511 [2024-06-10 10:14:30.253282] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:08.512 [2024-06-10 10:14:30.253287] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:08.512 [2024-06-10 10:14:30.253292] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.512 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:08.772 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.772 "name": "Existed_Raid", 00:19:08.772 "uuid": "e2d65745-7486-4843-ae29-f3232ba9793b", 00:19:08.772 "strip_size_kb": 0, 00:19:08.772 "state": "configuring", 00:19:08.772 "raid_level": "raid1", 00:19:08.772 "superblock": true, 00:19:08.772 "num_base_bdevs": 4, 00:19:08.772 "num_base_bdevs_discovered": 1, 00:19:08.772 
"num_base_bdevs_operational": 4, 00:19:08.772 "base_bdevs_list": [ 00:19:08.772 { 00:19:08.772 "name": "BaseBdev1", 00:19:08.772 "uuid": "52bceca8-e9f8-400d-bf44-2613df6713db", 00:19:08.772 "is_configured": true, 00:19:08.772 "data_offset": 2048, 00:19:08.772 "data_size": 63488 00:19:08.772 }, 00:19:08.772 { 00:19:08.772 "name": "BaseBdev2", 00:19:08.772 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.772 "is_configured": false, 00:19:08.772 "data_offset": 0, 00:19:08.772 "data_size": 0 00:19:08.772 }, 00:19:08.772 { 00:19:08.772 "name": "BaseBdev3", 00:19:08.772 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.772 "is_configured": false, 00:19:08.772 "data_offset": 0, 00:19:08.772 "data_size": 0 00:19:08.772 }, 00:19:08.772 { 00:19:08.772 "name": "BaseBdev4", 00:19:08.772 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.772 "is_configured": false, 00:19:08.772 "data_offset": 0, 00:19:08.772 "data_size": 0 00:19:08.772 } 00:19:08.772 ] 00:19:08.772 }' 00:19:08.772 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.772 10:14:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:09.341 10:14:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:09.342 [2024-06-10 10:14:31.147282] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:09.342 BaseBdev2 00:19:09.342 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:09.342 10:14:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:19:09.342 10:14:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:09.342 10:14:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:19:09.342 10:14:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:09.342 10:14:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:09.342 10:14:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:09.602 10:14:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:09.862 [ 00:19:09.862 { 00:19:09.862 "name": "BaseBdev2", 00:19:09.862 "aliases": [ 00:19:09.862 "6268c92d-bdc6-4f34-8300-e06eccf9ec0a" 00:19:09.862 ], 00:19:09.862 "product_name": "Malloc disk", 00:19:09.862 "block_size": 512, 00:19:09.862 "num_blocks": 65536, 00:19:09.862 "uuid": "6268c92d-bdc6-4f34-8300-e06eccf9ec0a", 00:19:09.862 "assigned_rate_limits": { 00:19:09.862 "rw_ios_per_sec": 0, 00:19:09.862 "rw_mbytes_per_sec": 0, 00:19:09.862 "r_mbytes_per_sec": 0, 00:19:09.862 "w_mbytes_per_sec": 0 00:19:09.862 }, 00:19:09.862 "claimed": true, 00:19:09.862 "claim_type": "exclusive_write", 00:19:09.862 "zoned": false, 00:19:09.862 "supported_io_types": { 00:19:09.862 "read": true, 00:19:09.862 "write": true, 00:19:09.862 "unmap": true, 00:19:09.862 "write_zeroes": true, 00:19:09.862 "flush": true, 00:19:09.862 "reset": true, 00:19:09.862 "compare": 
false, 00:19:09.862 "compare_and_write": false, 00:19:09.862 "abort": true, 00:19:09.862 "nvme_admin": false, 00:19:09.862 "nvme_io": false 00:19:09.862 }, 00:19:09.862 "memory_domains": [ 00:19:09.862 { 00:19:09.862 "dma_device_id": "system", 00:19:09.862 "dma_device_type": 1 00:19:09.862 }, 00:19:09.862 { 00:19:09.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.862 "dma_device_type": 2 00:19:09.862 } 00:19:09.862 ], 00:19:09.862 "driver_specific": {} 00:19:09.862 } 00:19:09.862 ] 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.862 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:10.122 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:10.122 "name": "Existed_Raid", 00:19:10.122 "uuid": "e2d65745-7486-4843-ae29-f3232ba9793b", 00:19:10.122 "strip_size_kb": 0, 00:19:10.122 "state": "configuring", 00:19:10.122 "raid_level": "raid1", 00:19:10.122 "superblock": true, 00:19:10.122 "num_base_bdevs": 4, 00:19:10.122 "num_base_bdevs_discovered": 2, 00:19:10.122 "num_base_bdevs_operational": 4, 00:19:10.122 "base_bdevs_list": [ 00:19:10.122 { 00:19:10.122 "name": "BaseBdev1", 00:19:10.122 "uuid": "52bceca8-e9f8-400d-bf44-2613df6713db", 00:19:10.122 "is_configured": true, 00:19:10.122 "data_offset": 2048, 00:19:10.122 "data_size": 63488 00:19:10.122 }, 00:19:10.122 { 00:19:10.122 "name": "BaseBdev2", 00:19:10.122 "uuid": "6268c92d-bdc6-4f34-8300-e06eccf9ec0a", 00:19:10.122 "is_configured": true, 00:19:10.122 "data_offset": 2048, 00:19:10.122 "data_size": 63488 00:19:10.122 }, 00:19:10.122 { 00:19:10.122 "name": "BaseBdev3", 00:19:10.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:10.122 "is_configured": false, 00:19:10.122 "data_offset": 0, 00:19:10.122 "data_size": 0 00:19:10.122 }, 00:19:10.122 { 00:19:10.122 
"name": "BaseBdev4", 00:19:10.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:10.122 "is_configured": false, 00:19:10.122 "data_offset": 0, 00:19:10.122 "data_size": 0 00:19:10.122 } 00:19:10.122 ] 00:19:10.122 }' 00:19:10.122 10:14:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:10.122 10:14:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:10.691 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:10.692 [2024-06-10 10:14:32.435513] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:10.692 BaseBdev3 00:19:10.692 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:10.692 10:14:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:19:10.692 10:14:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:10.692 10:14:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:19:10.692 10:14:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:10.692 10:14:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:10.692 10:14:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:10.951 10:14:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:10.951 [ 00:19:10.951 { 00:19:10.951 "name": "BaseBdev3", 00:19:10.951 "aliases": [ 00:19:10.951 "e33fba5d-43b1-4ea0-8c18-5b061d112527" 00:19:10.951 ], 00:19:10.951 "product_name": "Malloc disk", 00:19:10.951 "block_size": 512, 00:19:10.951 "num_blocks": 65536, 00:19:10.951 "uuid": "e33fba5d-43b1-4ea0-8c18-5b061d112527", 00:19:10.951 "assigned_rate_limits": { 00:19:10.951 "rw_ios_per_sec": 0, 00:19:10.951 "rw_mbytes_per_sec": 0, 00:19:10.951 "r_mbytes_per_sec": 0, 00:19:10.951 "w_mbytes_per_sec": 0 00:19:10.951 }, 00:19:10.951 "claimed": true, 00:19:10.951 "claim_type": "exclusive_write", 00:19:10.951 "zoned": false, 00:19:10.951 "supported_io_types": { 00:19:10.951 "read": true, 00:19:10.951 "write": true, 00:19:10.951 "unmap": true, 00:19:10.951 "write_zeroes": true, 00:19:10.951 "flush": true, 00:19:10.951 "reset": true, 00:19:10.951 "compare": false, 00:19:10.951 "compare_and_write": false, 00:19:10.951 "abort": true, 00:19:10.951 "nvme_admin": false, 00:19:10.951 "nvme_io": false 00:19:10.951 }, 00:19:10.951 "memory_domains": [ 00:19:10.951 { 00:19:10.951 "dma_device_id": "system", 00:19:10.951 "dma_device_type": 1 00:19:10.951 }, 00:19:10.951 { 00:19:10.951 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.951 "dma_device_type": 2 00:19:10.951 } 00:19:10.951 ], 00:19:10.951 "driver_specific": {} 00:19:10.951 } 00:19:10.951 ] 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.212 10:14:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:11.212 10:14:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:11.212 "name": "Existed_Raid", 00:19:11.212 "uuid": "e2d65745-7486-4843-ae29-f3232ba9793b", 00:19:11.212 "strip_size_kb": 0, 00:19:11.212 "state": "configuring", 00:19:11.212 "raid_level": "raid1", 00:19:11.212 "superblock": true, 00:19:11.212 "num_base_bdevs": 4, 00:19:11.212 "num_base_bdevs_discovered": 3, 00:19:11.212 "num_base_bdevs_operational": 4, 00:19:11.212 "base_bdevs_list": [ 00:19:11.212 { 00:19:11.212 "name": "BaseBdev1", 00:19:11.212 "uuid": "52bceca8-e9f8-400d-bf44-2613df6713db", 00:19:11.212 "is_configured": true, 00:19:11.212 "data_offset": 2048, 00:19:11.212 "data_size": 63488 00:19:11.212 }, 00:19:11.212 { 00:19:11.212 "name": "BaseBdev2", 00:19:11.212 "uuid": "6268c92d-bdc6-4f34-8300-e06eccf9ec0a", 00:19:11.212 "is_configured": true, 00:19:11.212 "data_offset": 2048, 00:19:11.212 "data_size": 63488 00:19:11.212 }, 00:19:11.212 { 00:19:11.212 "name": "BaseBdev3", 00:19:11.212 "uuid": "e33fba5d-43b1-4ea0-8c18-5b061d112527", 00:19:11.212 "is_configured": true, 00:19:11.212 "data_offset": 2048, 00:19:11.212 "data_size": 63488 00:19:11.212 }, 00:19:11.212 { 00:19:11.212 "name": "BaseBdev4", 00:19:11.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.212 "is_configured": false, 00:19:11.212 "data_offset": 0, 00:19:11.212 "data_size": 0 00:19:11.212 } 00:19:11.212 ] 00:19:11.212 }' 00:19:11.212 10:14:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:11.212 10:14:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:11.782 10:14:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:12.042 [2024-06-10 10:14:33.719717] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev4 is claimed 00:19:12.042 [2024-06-10 10:14:33.719848] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25574c0 00:19:12.042 [2024-06-10 10:14:33.719857] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:12.042 [2024-06-10 10:14:33.719998] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2708820 00:19:12.042 [2024-06-10 10:14:33.720092] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25574c0 00:19:12.042 [2024-06-10 10:14:33.720098] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25574c0 00:19:12.042 [2024-06-10 10:14:33.720166] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:12.042 BaseBdev4 00:19:12.042 10:14:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:12.042 10:14:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:19:12.042 10:14:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:12.043 10:14:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:19:12.043 10:14:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:12.043 10:14:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:12.043 10:14:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:12.303 10:14:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:12.303 [ 00:19:12.303 { 00:19:12.303 "name": "BaseBdev4", 00:19:12.303 "aliases": [ 00:19:12.303 "52e76ff5-3201-4652-b123-b624ba5a0f4b" 00:19:12.303 ], 00:19:12.303 "product_name": "Malloc disk", 00:19:12.303 "block_size": 512, 00:19:12.303 "num_blocks": 65536, 00:19:12.303 "uuid": "52e76ff5-3201-4652-b123-b624ba5a0f4b", 00:19:12.303 "assigned_rate_limits": { 00:19:12.303 "rw_ios_per_sec": 0, 00:19:12.303 "rw_mbytes_per_sec": 0, 00:19:12.303 "r_mbytes_per_sec": 0, 00:19:12.303 "w_mbytes_per_sec": 0 00:19:12.303 }, 00:19:12.303 "claimed": true, 00:19:12.303 "claim_type": "exclusive_write", 00:19:12.303 "zoned": false, 00:19:12.303 "supported_io_types": { 00:19:12.303 "read": true, 00:19:12.303 "write": true, 00:19:12.303 "unmap": true, 00:19:12.303 "write_zeroes": true, 00:19:12.303 "flush": true, 00:19:12.303 "reset": true, 00:19:12.303 "compare": false, 00:19:12.303 "compare_and_write": false, 00:19:12.303 "abort": true, 00:19:12.303 "nvme_admin": false, 00:19:12.303 "nvme_io": false 00:19:12.303 }, 00:19:12.303 "memory_domains": [ 00:19:12.303 { 00:19:12.303 "dma_device_id": "system", 00:19:12.303 "dma_device_type": 1 00:19:12.303 }, 00:19:12.303 { 00:19:12.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.303 "dma_device_type": 2 00:19:12.303 } 00:19:12.303 ], 00:19:12.303 "driver_specific": {} 00:19:12.303 } 00:19:12.303 ] 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.303 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:12.563 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:12.563 "name": "Existed_Raid", 00:19:12.563 "uuid": "e2d65745-7486-4843-ae29-f3232ba9793b", 00:19:12.563 "strip_size_kb": 0, 00:19:12.563 "state": "online", 00:19:12.563 "raid_level": "raid1", 00:19:12.563 "superblock": true, 00:19:12.563 "num_base_bdevs": 4, 00:19:12.563 "num_base_bdevs_discovered": 4, 00:19:12.563 "num_base_bdevs_operational": 4, 00:19:12.563 "base_bdevs_list": [ 00:19:12.563 { 00:19:12.563 "name": "BaseBdev1", 00:19:12.563 "uuid": "52bceca8-e9f8-400d-bf44-2613df6713db", 00:19:12.563 "is_configured": true, 00:19:12.563 "data_offset": 2048, 00:19:12.563 "data_size": 63488 00:19:12.563 }, 00:19:12.563 { 00:19:12.563 "name": "BaseBdev2", 00:19:12.563 "uuid": "6268c92d-bdc6-4f34-8300-e06eccf9ec0a", 00:19:12.563 "is_configured": true, 00:19:12.563 "data_offset": 2048, 00:19:12.563 "data_size": 63488 00:19:12.563 }, 00:19:12.563 { 00:19:12.563 "name": "BaseBdev3", 00:19:12.563 "uuid": "e33fba5d-43b1-4ea0-8c18-5b061d112527", 00:19:12.563 "is_configured": true, 00:19:12.563 "data_offset": 2048, 00:19:12.563 "data_size": 63488 00:19:12.563 }, 00:19:12.563 { 00:19:12.563 "name": "BaseBdev4", 00:19:12.563 "uuid": "52e76ff5-3201-4652-b123-b624ba5a0f4b", 00:19:12.563 "is_configured": true, 00:19:12.563 "data_offset": 2048, 00:19:12.563 "data_size": 63488 00:19:12.563 } 00:19:12.563 ] 00:19:12.563 }' 00:19:12.563 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:12.563 10:14:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:13.133 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:13.133 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:13.133 10:14:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:13.133 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:13.133 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:13.133 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:13.133 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:13.133 10:14:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:13.393 [2024-06-10 10:14:35.031292] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:13.393 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:13.393 "name": "Existed_Raid", 00:19:13.393 "aliases": [ 00:19:13.393 "e2d65745-7486-4843-ae29-f3232ba9793b" 00:19:13.393 ], 00:19:13.393 "product_name": "Raid Volume", 00:19:13.393 "block_size": 512, 00:19:13.393 "num_blocks": 63488, 00:19:13.393 "uuid": "e2d65745-7486-4843-ae29-f3232ba9793b", 00:19:13.393 "assigned_rate_limits": { 00:19:13.393 "rw_ios_per_sec": 0, 00:19:13.393 "rw_mbytes_per_sec": 0, 00:19:13.393 "r_mbytes_per_sec": 0, 00:19:13.393 "w_mbytes_per_sec": 0 00:19:13.393 }, 00:19:13.393 "claimed": false, 00:19:13.393 "zoned": false, 00:19:13.393 "supported_io_types": { 00:19:13.393 "read": true, 00:19:13.393 "write": true, 00:19:13.393 "unmap": false, 00:19:13.393 "write_zeroes": true, 00:19:13.393 "flush": false, 00:19:13.393 "reset": true, 00:19:13.394 "compare": false, 00:19:13.394 "compare_and_write": false, 00:19:13.394 "abort": false, 00:19:13.394 "nvme_admin": false, 00:19:13.394 "nvme_io": false 00:19:13.394 }, 00:19:13.394 "memory_domains": [ 00:19:13.394 { 00:19:13.394 "dma_device_id": "system", 00:19:13.394 "dma_device_type": 1 00:19:13.394 }, 00:19:13.394 { 00:19:13.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.394 "dma_device_type": 2 00:19:13.394 }, 00:19:13.394 { 00:19:13.394 "dma_device_id": "system", 00:19:13.394 "dma_device_type": 1 00:19:13.394 }, 00:19:13.394 { 00:19:13.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.394 "dma_device_type": 2 00:19:13.394 }, 00:19:13.394 { 00:19:13.394 "dma_device_id": "system", 00:19:13.394 "dma_device_type": 1 00:19:13.394 }, 00:19:13.394 { 00:19:13.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.394 "dma_device_type": 2 00:19:13.394 }, 00:19:13.394 { 00:19:13.394 "dma_device_id": "system", 00:19:13.394 "dma_device_type": 1 00:19:13.394 }, 00:19:13.394 { 00:19:13.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.394 "dma_device_type": 2 00:19:13.394 } 00:19:13.394 ], 00:19:13.394 "driver_specific": { 00:19:13.394 "raid": { 00:19:13.394 "uuid": "e2d65745-7486-4843-ae29-f3232ba9793b", 00:19:13.394 "strip_size_kb": 0, 00:19:13.394 "state": "online", 00:19:13.394 "raid_level": "raid1", 00:19:13.394 "superblock": true, 00:19:13.394 "num_base_bdevs": 4, 00:19:13.394 "num_base_bdevs_discovered": 4, 00:19:13.394 "num_base_bdevs_operational": 4, 00:19:13.394 "base_bdevs_list": [ 00:19:13.394 { 00:19:13.394 "name": "BaseBdev1", 00:19:13.394 "uuid": "52bceca8-e9f8-400d-bf44-2613df6713db", 00:19:13.394 "is_configured": true, 00:19:13.394 "data_offset": 2048, 00:19:13.394 "data_size": 63488 00:19:13.394 }, 00:19:13.394 { 00:19:13.394 "name": "BaseBdev2", 00:19:13.394 "uuid": 
"6268c92d-bdc6-4f34-8300-e06eccf9ec0a", 00:19:13.394 "is_configured": true, 00:19:13.394 "data_offset": 2048, 00:19:13.394 "data_size": 63488 00:19:13.394 }, 00:19:13.394 { 00:19:13.394 "name": "BaseBdev3", 00:19:13.394 "uuid": "e33fba5d-43b1-4ea0-8c18-5b061d112527", 00:19:13.394 "is_configured": true, 00:19:13.394 "data_offset": 2048, 00:19:13.394 "data_size": 63488 00:19:13.394 }, 00:19:13.394 { 00:19:13.394 "name": "BaseBdev4", 00:19:13.394 "uuid": "52e76ff5-3201-4652-b123-b624ba5a0f4b", 00:19:13.394 "is_configured": true, 00:19:13.394 "data_offset": 2048, 00:19:13.394 "data_size": 63488 00:19:13.394 } 00:19:13.394 ] 00:19:13.394 } 00:19:13.394 } 00:19:13.394 }' 00:19:13.394 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:13.394 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:13.394 BaseBdev2 00:19:13.394 BaseBdev3 00:19:13.394 BaseBdev4' 00:19:13.394 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:13.394 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:13.394 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:13.655 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:13.655 "name": "BaseBdev1", 00:19:13.655 "aliases": [ 00:19:13.655 "52bceca8-e9f8-400d-bf44-2613df6713db" 00:19:13.655 ], 00:19:13.655 "product_name": "Malloc disk", 00:19:13.655 "block_size": 512, 00:19:13.655 "num_blocks": 65536, 00:19:13.655 "uuid": "52bceca8-e9f8-400d-bf44-2613df6713db", 00:19:13.655 "assigned_rate_limits": { 00:19:13.655 "rw_ios_per_sec": 0, 00:19:13.655 "rw_mbytes_per_sec": 0, 00:19:13.655 "r_mbytes_per_sec": 0, 00:19:13.655 "w_mbytes_per_sec": 0 00:19:13.655 }, 00:19:13.655 "claimed": true, 00:19:13.655 "claim_type": "exclusive_write", 00:19:13.655 "zoned": false, 00:19:13.655 "supported_io_types": { 00:19:13.655 "read": true, 00:19:13.655 "write": true, 00:19:13.655 "unmap": true, 00:19:13.655 "write_zeroes": true, 00:19:13.655 "flush": true, 00:19:13.655 "reset": true, 00:19:13.655 "compare": false, 00:19:13.655 "compare_and_write": false, 00:19:13.655 "abort": true, 00:19:13.655 "nvme_admin": false, 00:19:13.655 "nvme_io": false 00:19:13.655 }, 00:19:13.655 "memory_domains": [ 00:19:13.655 { 00:19:13.655 "dma_device_id": "system", 00:19:13.655 "dma_device_type": 1 00:19:13.655 }, 00:19:13.655 { 00:19:13.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.655 "dma_device_type": 2 00:19:13.655 } 00:19:13.655 ], 00:19:13.655 "driver_specific": {} 00:19:13.655 }' 00:19:13.655 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.655 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.655 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:13.655 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.655 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.655 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:13.655 10:14:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.655 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.915 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:13.915 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.915 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.915 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:13.915 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:13.915 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:13.915 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:14.175 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:14.175 "name": "BaseBdev2", 00:19:14.175 "aliases": [ 00:19:14.175 "6268c92d-bdc6-4f34-8300-e06eccf9ec0a" 00:19:14.175 ], 00:19:14.175 "product_name": "Malloc disk", 00:19:14.175 "block_size": 512, 00:19:14.175 "num_blocks": 65536, 00:19:14.175 "uuid": "6268c92d-bdc6-4f34-8300-e06eccf9ec0a", 00:19:14.175 "assigned_rate_limits": { 00:19:14.175 "rw_ios_per_sec": 0, 00:19:14.175 "rw_mbytes_per_sec": 0, 00:19:14.175 "r_mbytes_per_sec": 0, 00:19:14.175 "w_mbytes_per_sec": 0 00:19:14.175 }, 00:19:14.175 "claimed": true, 00:19:14.175 "claim_type": "exclusive_write", 00:19:14.175 "zoned": false, 00:19:14.175 "supported_io_types": { 00:19:14.175 "read": true, 00:19:14.175 "write": true, 00:19:14.175 "unmap": true, 00:19:14.175 "write_zeroes": true, 00:19:14.175 "flush": true, 00:19:14.175 "reset": true, 00:19:14.175 "compare": false, 00:19:14.175 "compare_and_write": false, 00:19:14.175 "abort": true, 00:19:14.175 "nvme_admin": false, 00:19:14.175 "nvme_io": false 00:19:14.175 }, 00:19:14.175 "memory_domains": [ 00:19:14.175 { 00:19:14.175 "dma_device_id": "system", 00:19:14.175 "dma_device_type": 1 00:19:14.175 }, 00:19:14.175 { 00:19:14.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:14.175 "dma_device_type": 2 00:19:14.175 } 00:19:14.175 ], 00:19:14.175 "driver_specific": {} 00:19:14.175 }' 00:19:14.175 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:14.175 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:14.175 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:14.175 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:14.175 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:14.175 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:14.175 10:14:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:14.175 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:14.435 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:14.435 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:14.435 10:14:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:14.435 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:14.435 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:14.435 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:14.435 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:14.695 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:14.695 "name": "BaseBdev3", 00:19:14.695 "aliases": [ 00:19:14.695 "e33fba5d-43b1-4ea0-8c18-5b061d112527" 00:19:14.695 ], 00:19:14.695 "product_name": "Malloc disk", 00:19:14.695 "block_size": 512, 00:19:14.695 "num_blocks": 65536, 00:19:14.695 "uuid": "e33fba5d-43b1-4ea0-8c18-5b061d112527", 00:19:14.695 "assigned_rate_limits": { 00:19:14.695 "rw_ios_per_sec": 0, 00:19:14.695 "rw_mbytes_per_sec": 0, 00:19:14.695 "r_mbytes_per_sec": 0, 00:19:14.695 "w_mbytes_per_sec": 0 00:19:14.695 }, 00:19:14.695 "claimed": true, 00:19:14.695 "claim_type": "exclusive_write", 00:19:14.695 "zoned": false, 00:19:14.695 "supported_io_types": { 00:19:14.695 "read": true, 00:19:14.695 "write": true, 00:19:14.695 "unmap": true, 00:19:14.695 "write_zeroes": true, 00:19:14.695 "flush": true, 00:19:14.696 "reset": true, 00:19:14.696 "compare": false, 00:19:14.696 "compare_and_write": false, 00:19:14.696 "abort": true, 00:19:14.696 "nvme_admin": false, 00:19:14.696 "nvme_io": false 00:19:14.696 }, 00:19:14.696 "memory_domains": [ 00:19:14.696 { 00:19:14.696 "dma_device_id": "system", 00:19:14.696 "dma_device_type": 1 00:19:14.696 }, 00:19:14.696 { 00:19:14.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:14.696 "dma_device_type": 2 00:19:14.696 } 00:19:14.696 ], 00:19:14.696 "driver_specific": {} 00:19:14.696 }' 00:19:14.696 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:14.696 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:14.696 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:14.696 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:14.696 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:14.696 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:14.696 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:14.696 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:14.956 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:14.956 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:14.956 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:14.956 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:14.956 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:14.956 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:14.956 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:15.216 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:15.216 "name": "BaseBdev4", 00:19:15.216 "aliases": [ 00:19:15.216 "52e76ff5-3201-4652-b123-b624ba5a0f4b" 00:19:15.216 ], 00:19:15.216 "product_name": "Malloc disk", 00:19:15.216 "block_size": 512, 00:19:15.216 "num_blocks": 65536, 00:19:15.216 "uuid": "52e76ff5-3201-4652-b123-b624ba5a0f4b", 00:19:15.216 "assigned_rate_limits": { 00:19:15.216 "rw_ios_per_sec": 0, 00:19:15.216 "rw_mbytes_per_sec": 0, 00:19:15.216 "r_mbytes_per_sec": 0, 00:19:15.216 "w_mbytes_per_sec": 0 00:19:15.216 }, 00:19:15.216 "claimed": true, 00:19:15.216 "claim_type": "exclusive_write", 00:19:15.216 "zoned": false, 00:19:15.216 "supported_io_types": { 00:19:15.216 "read": true, 00:19:15.216 "write": true, 00:19:15.216 "unmap": true, 00:19:15.216 "write_zeroes": true, 00:19:15.216 "flush": true, 00:19:15.216 "reset": true, 00:19:15.216 "compare": false, 00:19:15.216 "compare_and_write": false, 00:19:15.216 "abort": true, 00:19:15.216 "nvme_admin": false, 00:19:15.216 "nvme_io": false 00:19:15.216 }, 00:19:15.216 "memory_domains": [ 00:19:15.217 { 00:19:15.217 "dma_device_id": "system", 00:19:15.217 "dma_device_type": 1 00:19:15.217 }, 00:19:15.217 { 00:19:15.217 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:15.217 "dma_device_type": 2 00:19:15.217 } 00:19:15.217 ], 00:19:15.217 "driver_specific": {} 00:19:15.217 }' 00:19:15.217 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:15.217 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:15.217 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:15.217 10:14:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:15.217 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:15.217 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:15.217 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:15.477 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:15.477 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:15.477 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:15.477 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:15.477 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:15.477 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:15.737 [2024-06-10 10:14:37.405107] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 
00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.737 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.738 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:15.998 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.998 "name": "Existed_Raid", 00:19:15.998 "uuid": "e2d65745-7486-4843-ae29-f3232ba9793b", 00:19:15.998 "strip_size_kb": 0, 00:19:15.998 "state": "online", 00:19:15.998 "raid_level": "raid1", 00:19:15.998 "superblock": true, 00:19:15.998 "num_base_bdevs": 4, 00:19:15.998 "num_base_bdevs_discovered": 3, 00:19:15.998 "num_base_bdevs_operational": 3, 00:19:15.998 "base_bdevs_list": [ 00:19:15.998 { 00:19:15.998 "name": null, 00:19:15.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.998 "is_configured": false, 00:19:15.998 "data_offset": 2048, 00:19:15.998 "data_size": 63488 00:19:15.998 }, 00:19:15.998 { 00:19:15.998 "name": "BaseBdev2", 00:19:15.998 "uuid": "6268c92d-bdc6-4f34-8300-e06eccf9ec0a", 00:19:15.998 "is_configured": true, 00:19:15.998 "data_offset": 2048, 00:19:15.998 "data_size": 63488 00:19:15.998 }, 00:19:15.998 { 00:19:15.998 "name": "BaseBdev3", 00:19:15.998 "uuid": "e33fba5d-43b1-4ea0-8c18-5b061d112527", 00:19:15.998 "is_configured": true, 00:19:15.998 "data_offset": 2048, 00:19:15.998 "data_size": 63488 00:19:15.998 }, 00:19:15.998 { 00:19:15.998 "name": "BaseBdev4", 00:19:15.998 "uuid": "52e76ff5-3201-4652-b123-b624ba5a0f4b", 00:19:15.998 "is_configured": true, 00:19:15.998 "data_offset": 2048, 00:19:15.998 "data_size": 63488 00:19:15.998 } 00:19:15.998 ] 00:19:15.998 }' 00:19:15.998 10:14:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.998 10:14:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:16.569 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:16.569 10:14:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:16.569 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.569 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:16.569 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:16.569 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:16.569 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:16.830 [2024-06-10 10:14:38.479840] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:16.830 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:16.830 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:16.830 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.830 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:16.830 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:16.830 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:16.830 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:17.091 [2024-06-10 10:14:38.862634] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:17.091 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:17.091 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:17.091 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.091 10:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:17.352 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:17.352 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:17.352 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:17.613 [2024-06-10 10:14:39.249439] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:17.613 [2024-06-10 10:14:39.249504] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:17.613 [2024-06-10 10:14:39.255540] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:17.613 [2024-06-10 10:14:39.255564] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:17.613 [2024-06-10 10:14:39.255570] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25574c0 name Existed_Raid, state offline 00:19:17.613 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:17.613 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:17.613 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.613 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:17.613 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:17.613 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:17.613 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:17.613 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:17.613 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:17.613 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:17.874 BaseBdev2 00:19:17.874 10:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:17.874 10:14:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:19:17.874 10:14:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:17.874 10:14:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:19:17.874 10:14:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:17.874 10:14:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:17.874 10:14:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.135 10:14:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:18.396 [ 00:19:18.396 { 00:19:18.396 "name": "BaseBdev2", 00:19:18.396 "aliases": [ 00:19:18.396 "c1063e0c-f833-4365-8f69-4ce319a12052" 00:19:18.396 ], 00:19:18.396 "product_name": "Malloc disk", 00:19:18.396 "block_size": 512, 00:19:18.396 "num_blocks": 65536, 00:19:18.396 "uuid": "c1063e0c-f833-4365-8f69-4ce319a12052", 00:19:18.396 "assigned_rate_limits": { 00:19:18.396 "rw_ios_per_sec": 0, 00:19:18.396 "rw_mbytes_per_sec": 0, 00:19:18.396 "r_mbytes_per_sec": 0, 00:19:18.396 "w_mbytes_per_sec": 0 00:19:18.396 }, 00:19:18.396 "claimed": false, 00:19:18.396 "zoned": false, 00:19:18.396 "supported_io_types": { 00:19:18.396 "read": true, 00:19:18.396 "write": true, 00:19:18.396 "unmap": true, 00:19:18.396 "write_zeroes": true, 00:19:18.396 "flush": true, 00:19:18.396 "reset": true, 00:19:18.396 "compare": false, 00:19:18.396 "compare_and_write": false, 00:19:18.396 "abort": true, 00:19:18.396 "nvme_admin": false, 00:19:18.396 "nvme_io": false 00:19:18.396 }, 00:19:18.396 "memory_domains": [ 
00:19:18.396 { 00:19:18.396 "dma_device_id": "system", 00:19:18.396 "dma_device_type": 1 00:19:18.396 }, 00:19:18.396 { 00:19:18.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.396 "dma_device_type": 2 00:19:18.396 } 00:19:18.396 ], 00:19:18.396 "driver_specific": {} 00:19:18.396 } 00:19:18.396 ] 00:19:18.396 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:19:18.396 10:14:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:18.396 10:14:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:18.396 10:14:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:18.396 BaseBdev3 00:19:18.396 10:14:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:18.396 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:19:18.396 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:18.396 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:19:18.396 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:18.396 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:18.396 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.657 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:18.917 [ 00:19:18.917 { 00:19:18.917 "name": "BaseBdev3", 00:19:18.917 "aliases": [ 00:19:18.917 "af1ea89b-374a-4621-bb79-dc65428661b8" 00:19:18.917 ], 00:19:18.917 "product_name": "Malloc disk", 00:19:18.917 "block_size": 512, 00:19:18.917 "num_blocks": 65536, 00:19:18.917 "uuid": "af1ea89b-374a-4621-bb79-dc65428661b8", 00:19:18.917 "assigned_rate_limits": { 00:19:18.917 "rw_ios_per_sec": 0, 00:19:18.917 "rw_mbytes_per_sec": 0, 00:19:18.917 "r_mbytes_per_sec": 0, 00:19:18.917 "w_mbytes_per_sec": 0 00:19:18.917 }, 00:19:18.917 "claimed": false, 00:19:18.917 "zoned": false, 00:19:18.917 "supported_io_types": { 00:19:18.917 "read": true, 00:19:18.917 "write": true, 00:19:18.917 "unmap": true, 00:19:18.917 "write_zeroes": true, 00:19:18.917 "flush": true, 00:19:18.917 "reset": true, 00:19:18.917 "compare": false, 00:19:18.917 "compare_and_write": false, 00:19:18.917 "abort": true, 00:19:18.917 "nvme_admin": false, 00:19:18.917 "nvme_io": false 00:19:18.917 }, 00:19:18.917 "memory_domains": [ 00:19:18.917 { 00:19:18.917 "dma_device_id": "system", 00:19:18.917 "dma_device_type": 1 00:19:18.917 }, 00:19:18.917 { 00:19:18.917 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.917 "dma_device_type": 2 00:19:18.917 } 00:19:18.917 ], 00:19:18.917 "driver_specific": {} 00:19:18.917 } 00:19:18.917 ] 00:19:18.917 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:19:18.917 10:14:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:18.917 10:14:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:18.917 10:14:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:18.917 BaseBdev4 00:19:18.917 10:14:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:18.917 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:19:18.917 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:18.917 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:19:18.917 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:18.917 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:18.917 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:19.178 10:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:19.439 [ 00:19:19.439 { 00:19:19.439 "name": "BaseBdev4", 00:19:19.439 "aliases": [ 00:19:19.439 "d9243707-a64f-4c88-8763-195e82af77cf" 00:19:19.439 ], 00:19:19.439 "product_name": "Malloc disk", 00:19:19.439 "block_size": 512, 00:19:19.439 "num_blocks": 65536, 00:19:19.439 "uuid": "d9243707-a64f-4c88-8763-195e82af77cf", 00:19:19.439 "assigned_rate_limits": { 00:19:19.439 "rw_ios_per_sec": 0, 00:19:19.439 "rw_mbytes_per_sec": 0, 00:19:19.439 "r_mbytes_per_sec": 0, 00:19:19.439 "w_mbytes_per_sec": 0 00:19:19.439 }, 00:19:19.439 "claimed": false, 00:19:19.439 "zoned": false, 00:19:19.439 "supported_io_types": { 00:19:19.439 "read": true, 00:19:19.439 "write": true, 00:19:19.439 "unmap": true, 00:19:19.439 "write_zeroes": true, 00:19:19.439 "flush": true, 00:19:19.439 "reset": true, 00:19:19.439 "compare": false, 00:19:19.439 "compare_and_write": false, 00:19:19.439 "abort": true, 00:19:19.439 "nvme_admin": false, 00:19:19.439 "nvme_io": false 00:19:19.439 }, 00:19:19.439 "memory_domains": [ 00:19:19.439 { 00:19:19.439 "dma_device_id": "system", 00:19:19.439 "dma_device_type": 1 00:19:19.439 }, 00:19:19.439 { 00:19:19.439 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.439 "dma_device_type": 2 00:19:19.439 } 00:19:19.439 ], 00:19:19.439 "driver_specific": {} 00:19:19.439 } 00:19:19.439 ] 00:19:19.439 10:14:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:19:19.439 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:19.439 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:19.439 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:19.700 [2024-06-10 10:14:41.316451] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:19.701 [2024-06-10 10:14:41.316482] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:19.701 [2024-06-10 10:14:41.316496] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:19.701 [2024-06-10 10:14:41.317535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:19.701 [2024-06-10 10:14:41.317566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:19.701 "name": "Existed_Raid", 00:19:19.701 "uuid": "1a3b09ee-577e-492e-ae07-2fe899326d74", 00:19:19.701 "strip_size_kb": 0, 00:19:19.701 "state": "configuring", 00:19:19.701 "raid_level": "raid1", 00:19:19.701 "superblock": true, 00:19:19.701 "num_base_bdevs": 4, 00:19:19.701 "num_base_bdevs_discovered": 3, 00:19:19.701 "num_base_bdevs_operational": 4, 00:19:19.701 "base_bdevs_list": [ 00:19:19.701 { 00:19:19.701 "name": "BaseBdev1", 00:19:19.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.701 "is_configured": false, 00:19:19.701 "data_offset": 0, 00:19:19.701 "data_size": 0 00:19:19.701 }, 00:19:19.701 { 00:19:19.701 "name": "BaseBdev2", 00:19:19.701 "uuid": "c1063e0c-f833-4365-8f69-4ce319a12052", 00:19:19.701 "is_configured": true, 00:19:19.701 "data_offset": 2048, 00:19:19.701 "data_size": 63488 00:19:19.701 }, 00:19:19.701 { 00:19:19.701 "name": "BaseBdev3", 00:19:19.701 "uuid": "af1ea89b-374a-4621-bb79-dc65428661b8", 00:19:19.701 "is_configured": true, 00:19:19.701 "data_offset": 2048, 00:19:19.701 "data_size": 63488 00:19:19.701 }, 00:19:19.701 { 00:19:19.701 "name": "BaseBdev4", 00:19:19.701 "uuid": "d9243707-a64f-4c88-8763-195e82af77cf", 00:19:19.701 "is_configured": true, 00:19:19.701 "data_offset": 2048, 00:19:19.701 "data_size": 63488 00:19:19.701 } 00:19:19.701 ] 00:19:19.701 }' 00:19:19.701 10:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:19.701 10:14:41 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:20.273 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:20.533 [2024-06-10 10:14:42.226740] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:20.533 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:20.533 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:20.533 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:20.533 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:20.533 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:20.534 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:20.534 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.534 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.534 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.534 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.534 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.534 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:20.794 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.794 "name": "Existed_Raid", 00:19:20.794 "uuid": "1a3b09ee-577e-492e-ae07-2fe899326d74", 00:19:20.794 "strip_size_kb": 0, 00:19:20.794 "state": "configuring", 00:19:20.794 "raid_level": "raid1", 00:19:20.794 "superblock": true, 00:19:20.794 "num_base_bdevs": 4, 00:19:20.794 "num_base_bdevs_discovered": 2, 00:19:20.794 "num_base_bdevs_operational": 4, 00:19:20.794 "base_bdevs_list": [ 00:19:20.794 { 00:19:20.794 "name": "BaseBdev1", 00:19:20.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.794 "is_configured": false, 00:19:20.794 "data_offset": 0, 00:19:20.794 "data_size": 0 00:19:20.794 }, 00:19:20.794 { 00:19:20.794 "name": null, 00:19:20.794 "uuid": "c1063e0c-f833-4365-8f69-4ce319a12052", 00:19:20.794 "is_configured": false, 00:19:20.794 "data_offset": 2048, 00:19:20.794 "data_size": 63488 00:19:20.794 }, 00:19:20.794 { 00:19:20.794 "name": "BaseBdev3", 00:19:20.794 "uuid": "af1ea89b-374a-4621-bb79-dc65428661b8", 00:19:20.794 "is_configured": true, 00:19:20.794 "data_offset": 2048, 00:19:20.794 "data_size": 63488 00:19:20.794 }, 00:19:20.794 { 00:19:20.794 "name": "BaseBdev4", 00:19:20.794 "uuid": "d9243707-a64f-4c88-8763-195e82af77cf", 00:19:20.794 "is_configured": true, 00:19:20.794 "data_offset": 2048, 00:19:20.794 "data_size": 63488 00:19:20.794 } 00:19:20.794 ] 00:19:20.794 }' 00:19:20.794 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.794 10:14:42 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:19:21.366 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.366 10:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:21.366 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:21.366 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:21.627 [2024-06-10 10:14:43.334597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:21.627 BaseBdev1 00:19:21.627 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:21.627 10:14:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:19:21.627 10:14:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:21.627 10:14:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:19:21.627 10:14:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:21.627 10:14:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:21.627 10:14:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:21.888 10:14:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:21.888 [ 00:19:21.888 { 00:19:21.888 "name": "BaseBdev1", 00:19:21.888 "aliases": [ 00:19:21.888 "73eebecc-d5f2-46cd-8b2e-4435be6e4b18" 00:19:21.888 ], 00:19:21.888 "product_name": "Malloc disk", 00:19:21.888 "block_size": 512, 00:19:21.888 "num_blocks": 65536, 00:19:21.888 "uuid": "73eebecc-d5f2-46cd-8b2e-4435be6e4b18", 00:19:21.888 "assigned_rate_limits": { 00:19:21.888 "rw_ios_per_sec": 0, 00:19:21.888 "rw_mbytes_per_sec": 0, 00:19:21.888 "r_mbytes_per_sec": 0, 00:19:21.888 "w_mbytes_per_sec": 0 00:19:21.888 }, 00:19:21.888 "claimed": true, 00:19:21.888 "claim_type": "exclusive_write", 00:19:21.888 "zoned": false, 00:19:21.888 "supported_io_types": { 00:19:21.888 "read": true, 00:19:21.888 "write": true, 00:19:21.888 "unmap": true, 00:19:21.888 "write_zeroes": true, 00:19:21.888 "flush": true, 00:19:21.888 "reset": true, 00:19:21.888 "compare": false, 00:19:21.888 "compare_and_write": false, 00:19:21.888 "abort": true, 00:19:21.888 "nvme_admin": false, 00:19:21.888 "nvme_io": false 00:19:21.888 }, 00:19:21.888 "memory_domains": [ 00:19:21.888 { 00:19:21.888 "dma_device_id": "system", 00:19:21.888 "dma_device_type": 1 00:19:21.888 }, 00:19:21.888 { 00:19:21.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.888 "dma_device_type": 2 00:19:21.888 } 00:19:21.888 ], 00:19:21.888 "driver_specific": {} 00:19:21.888 } 00:19:21.888 ] 00:19:21.888 10:14:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:19:21.888 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:21.888 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:21.888 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:21.888 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:21.888 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:21.888 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:21.889 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:21.889 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:21.889 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:21.889 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:21.889 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.889 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:22.150 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.150 "name": "Existed_Raid", 00:19:22.150 "uuid": "1a3b09ee-577e-492e-ae07-2fe899326d74", 00:19:22.150 "strip_size_kb": 0, 00:19:22.150 "state": "configuring", 00:19:22.150 "raid_level": "raid1", 00:19:22.150 "superblock": true, 00:19:22.150 "num_base_bdevs": 4, 00:19:22.150 "num_base_bdevs_discovered": 3, 00:19:22.150 "num_base_bdevs_operational": 4, 00:19:22.150 "base_bdevs_list": [ 00:19:22.150 { 00:19:22.150 "name": "BaseBdev1", 00:19:22.150 "uuid": "73eebecc-d5f2-46cd-8b2e-4435be6e4b18", 00:19:22.150 "is_configured": true, 00:19:22.150 "data_offset": 2048, 00:19:22.150 "data_size": 63488 00:19:22.150 }, 00:19:22.150 { 00:19:22.150 "name": null, 00:19:22.150 "uuid": "c1063e0c-f833-4365-8f69-4ce319a12052", 00:19:22.150 "is_configured": false, 00:19:22.150 "data_offset": 2048, 00:19:22.150 "data_size": 63488 00:19:22.150 }, 00:19:22.150 { 00:19:22.150 "name": "BaseBdev3", 00:19:22.150 "uuid": "af1ea89b-374a-4621-bb79-dc65428661b8", 00:19:22.150 "is_configured": true, 00:19:22.150 "data_offset": 2048, 00:19:22.150 "data_size": 63488 00:19:22.150 }, 00:19:22.150 { 00:19:22.150 "name": "BaseBdev4", 00:19:22.150 "uuid": "d9243707-a64f-4c88-8763-195e82af77cf", 00:19:22.150 "is_configured": true, 00:19:22.150 "data_offset": 2048, 00:19:22.150 "data_size": 63488 00:19:22.150 } 00:19:22.150 ] 00:19:22.150 }' 00:19:22.150 10:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.150 10:14:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:22.722 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.722 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:22.983 
10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:22.983 [2024-06-10 10:14:44.826388] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.983 10:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:23.244 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:23.244 "name": "Existed_Raid", 00:19:23.244 "uuid": "1a3b09ee-577e-492e-ae07-2fe899326d74", 00:19:23.244 "strip_size_kb": 0, 00:19:23.244 "state": "configuring", 00:19:23.244 "raid_level": "raid1", 00:19:23.244 "superblock": true, 00:19:23.244 "num_base_bdevs": 4, 00:19:23.244 "num_base_bdevs_discovered": 2, 00:19:23.244 "num_base_bdevs_operational": 4, 00:19:23.244 "base_bdevs_list": [ 00:19:23.244 { 00:19:23.244 "name": "BaseBdev1", 00:19:23.244 "uuid": "73eebecc-d5f2-46cd-8b2e-4435be6e4b18", 00:19:23.244 "is_configured": true, 00:19:23.244 "data_offset": 2048, 00:19:23.244 "data_size": 63488 00:19:23.244 }, 00:19:23.244 { 00:19:23.244 "name": null, 00:19:23.244 "uuid": "c1063e0c-f833-4365-8f69-4ce319a12052", 00:19:23.244 "is_configured": false, 00:19:23.244 "data_offset": 2048, 00:19:23.244 "data_size": 63488 00:19:23.244 }, 00:19:23.244 { 00:19:23.244 "name": null, 00:19:23.244 "uuid": "af1ea89b-374a-4621-bb79-dc65428661b8", 00:19:23.244 "is_configured": false, 00:19:23.244 "data_offset": 2048, 00:19:23.244 "data_size": 63488 00:19:23.244 }, 00:19:23.244 { 00:19:23.244 "name": "BaseBdev4", 00:19:23.244 "uuid": "d9243707-a64f-4c88-8763-195e82af77cf", 00:19:23.244 "is_configured": true, 00:19:23.244 "data_offset": 2048, 00:19:23.244 "data_size": 63488 00:19:23.244 } 00:19:23.244 ] 00:19:23.244 }' 00:19:23.244 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:23.244 10:14:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:23.816 10:14:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.816 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:24.076 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:24.077 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:24.077 [2024-06-10 10:14:45.937217] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:24.336 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:24.336 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:24.336 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:24.337 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:24.337 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:24.337 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:24.337 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:24.337 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:24.337 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:24.337 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:24.337 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.337 10:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:24.337 10:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.337 "name": "Existed_Raid", 00:19:24.337 "uuid": "1a3b09ee-577e-492e-ae07-2fe899326d74", 00:19:24.337 "strip_size_kb": 0, 00:19:24.337 "state": "configuring", 00:19:24.337 "raid_level": "raid1", 00:19:24.337 "superblock": true, 00:19:24.337 "num_base_bdevs": 4, 00:19:24.337 "num_base_bdevs_discovered": 3, 00:19:24.337 "num_base_bdevs_operational": 4, 00:19:24.337 "base_bdevs_list": [ 00:19:24.337 { 00:19:24.337 "name": "BaseBdev1", 00:19:24.337 "uuid": "73eebecc-d5f2-46cd-8b2e-4435be6e4b18", 00:19:24.337 "is_configured": true, 00:19:24.337 "data_offset": 2048, 00:19:24.337 "data_size": 63488 00:19:24.337 }, 00:19:24.337 { 00:19:24.337 "name": null, 00:19:24.337 "uuid": "c1063e0c-f833-4365-8f69-4ce319a12052", 00:19:24.337 "is_configured": false, 00:19:24.337 "data_offset": 2048, 00:19:24.337 "data_size": 63488 00:19:24.337 }, 00:19:24.337 { 00:19:24.337 "name": "BaseBdev3", 00:19:24.337 "uuid": "af1ea89b-374a-4621-bb79-dc65428661b8", 00:19:24.337 "is_configured": true, 00:19:24.337 "data_offset": 2048, 00:19:24.337 "data_size": 63488 00:19:24.337 }, 00:19:24.337 { 00:19:24.337 "name": "BaseBdev4", 00:19:24.337 "uuid": 
"d9243707-a64f-4c88-8763-195e82af77cf", 00:19:24.337 "is_configured": true, 00:19:24.337 "data_offset": 2048, 00:19:24.337 "data_size": 63488 00:19:24.337 } 00:19:24.337 ] 00:19:24.337 }' 00:19:24.337 10:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:24.337 10:14:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:24.924 10:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.924 10:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:25.231 10:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:25.231 10:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:25.231 [2024-06-10 10:14:47.031988] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.231 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:25.492 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.492 "name": "Existed_Raid", 00:19:25.492 "uuid": "1a3b09ee-577e-492e-ae07-2fe899326d74", 00:19:25.492 "strip_size_kb": 0, 00:19:25.492 "state": "configuring", 00:19:25.492 "raid_level": "raid1", 00:19:25.492 "superblock": true, 00:19:25.492 "num_base_bdevs": 4, 00:19:25.492 "num_base_bdevs_discovered": 2, 00:19:25.492 "num_base_bdevs_operational": 4, 00:19:25.492 "base_bdevs_list": [ 00:19:25.492 { 00:19:25.492 "name": null, 00:19:25.492 "uuid": "73eebecc-d5f2-46cd-8b2e-4435be6e4b18", 00:19:25.492 "is_configured": false, 00:19:25.492 "data_offset": 2048, 00:19:25.492 "data_size": 63488 00:19:25.492 }, 00:19:25.492 { 00:19:25.492 "name": null, 00:19:25.492 "uuid": "c1063e0c-f833-4365-8f69-4ce319a12052", 00:19:25.492 "is_configured": false, 
00:19:25.492 "data_offset": 2048, 00:19:25.492 "data_size": 63488 00:19:25.492 }, 00:19:25.492 { 00:19:25.492 "name": "BaseBdev3", 00:19:25.492 "uuid": "af1ea89b-374a-4621-bb79-dc65428661b8", 00:19:25.492 "is_configured": true, 00:19:25.492 "data_offset": 2048, 00:19:25.492 "data_size": 63488 00:19:25.492 }, 00:19:25.492 { 00:19:25.492 "name": "BaseBdev4", 00:19:25.492 "uuid": "d9243707-a64f-4c88-8763-195e82af77cf", 00:19:25.493 "is_configured": true, 00:19:25.493 "data_offset": 2048, 00:19:25.493 "data_size": 63488 00:19:25.493 } 00:19:25.493 ] 00:19:25.493 }' 00:19:25.493 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.493 10:14:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:26.062 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.062 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:26.322 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:26.322 10:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:26.322 [2024-06-10 10:14:48.140350] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.322 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:26.582 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:26.582 "name": "Existed_Raid", 00:19:26.582 "uuid": "1a3b09ee-577e-492e-ae07-2fe899326d74", 00:19:26.582 "strip_size_kb": 0, 00:19:26.582 "state": "configuring", 00:19:26.582 "raid_level": "raid1", 00:19:26.582 "superblock": true, 00:19:26.582 "num_base_bdevs": 4, 00:19:26.582 "num_base_bdevs_discovered": 3, 00:19:26.582 
"num_base_bdevs_operational": 4, 00:19:26.582 "base_bdevs_list": [ 00:19:26.582 { 00:19:26.582 "name": null, 00:19:26.582 "uuid": "73eebecc-d5f2-46cd-8b2e-4435be6e4b18", 00:19:26.582 "is_configured": false, 00:19:26.582 "data_offset": 2048, 00:19:26.582 "data_size": 63488 00:19:26.582 }, 00:19:26.582 { 00:19:26.582 "name": "BaseBdev2", 00:19:26.582 "uuid": "c1063e0c-f833-4365-8f69-4ce319a12052", 00:19:26.582 "is_configured": true, 00:19:26.582 "data_offset": 2048, 00:19:26.582 "data_size": 63488 00:19:26.582 }, 00:19:26.582 { 00:19:26.583 "name": "BaseBdev3", 00:19:26.583 "uuid": "af1ea89b-374a-4621-bb79-dc65428661b8", 00:19:26.583 "is_configured": true, 00:19:26.583 "data_offset": 2048, 00:19:26.583 "data_size": 63488 00:19:26.583 }, 00:19:26.583 { 00:19:26.583 "name": "BaseBdev4", 00:19:26.583 "uuid": "d9243707-a64f-4c88-8763-195e82af77cf", 00:19:26.583 "is_configured": true, 00:19:26.583 "data_offset": 2048, 00:19:26.583 "data_size": 63488 00:19:26.583 } 00:19:26.583 ] 00:19:26.583 }' 00:19:26.583 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:26.583 10:14:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:27.153 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.153 10:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:27.413 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:27.413 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.413 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:27.413 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 73eebecc-d5f2-46cd-8b2e-4435be6e4b18 00:19:27.673 [2024-06-10 10:14:49.452674] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:27.673 [2024-06-10 10:14:49.452794] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25570f0 00:19:27.673 [2024-06-10 10:14:49.452801] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:27.673 [2024-06-10 10:14:49.452947] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2254240 00:19:27.673 [2024-06-10 10:14:49.453040] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25570f0 00:19:27.673 [2024-06-10 10:14:49.453046] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25570f0 00:19:27.673 [2024-06-10 10:14:49.453112] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:27.673 NewBaseBdev 00:19:27.673 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:27.673 10:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:19:27.673 10:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:27.673 10:14:49 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:19:27.673 10:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:27.673 10:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:27.673 10:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:27.933 10:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:28.195 [ 00:19:28.195 { 00:19:28.195 "name": "NewBaseBdev", 00:19:28.195 "aliases": [ 00:19:28.195 "73eebecc-d5f2-46cd-8b2e-4435be6e4b18" 00:19:28.195 ], 00:19:28.195 "product_name": "Malloc disk", 00:19:28.195 "block_size": 512, 00:19:28.195 "num_blocks": 65536, 00:19:28.195 "uuid": "73eebecc-d5f2-46cd-8b2e-4435be6e4b18", 00:19:28.195 "assigned_rate_limits": { 00:19:28.195 "rw_ios_per_sec": 0, 00:19:28.195 "rw_mbytes_per_sec": 0, 00:19:28.195 "r_mbytes_per_sec": 0, 00:19:28.195 "w_mbytes_per_sec": 0 00:19:28.195 }, 00:19:28.195 "claimed": true, 00:19:28.195 "claim_type": "exclusive_write", 00:19:28.195 "zoned": false, 00:19:28.195 "supported_io_types": { 00:19:28.195 "read": true, 00:19:28.195 "write": true, 00:19:28.195 "unmap": true, 00:19:28.195 "write_zeroes": true, 00:19:28.195 "flush": true, 00:19:28.195 "reset": true, 00:19:28.195 "compare": false, 00:19:28.195 "compare_and_write": false, 00:19:28.195 "abort": true, 00:19:28.195 "nvme_admin": false, 00:19:28.195 "nvme_io": false 00:19:28.195 }, 00:19:28.195 "memory_domains": [ 00:19:28.195 { 00:19:28.195 "dma_device_id": "system", 00:19:28.195 "dma_device_type": 1 00:19:28.195 }, 00:19:28.195 { 00:19:28.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.195 "dma_device_type": 2 00:19:28.195 } 00:19:28.195 ], 00:19:28.195 "driver_specific": {} 00:19:28.195 } 00:19:28.195 ] 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.195 10:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:28.195 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:28.195 "name": "Existed_Raid", 00:19:28.195 "uuid": "1a3b09ee-577e-492e-ae07-2fe899326d74", 00:19:28.195 "strip_size_kb": 0, 00:19:28.195 "state": "online", 00:19:28.195 "raid_level": "raid1", 00:19:28.195 "superblock": true, 00:19:28.195 "num_base_bdevs": 4, 00:19:28.195 "num_base_bdevs_discovered": 4, 00:19:28.195 "num_base_bdevs_operational": 4, 00:19:28.195 "base_bdevs_list": [ 00:19:28.195 { 00:19:28.195 "name": "NewBaseBdev", 00:19:28.195 "uuid": "73eebecc-d5f2-46cd-8b2e-4435be6e4b18", 00:19:28.195 "is_configured": true, 00:19:28.195 "data_offset": 2048, 00:19:28.195 "data_size": 63488 00:19:28.195 }, 00:19:28.195 { 00:19:28.195 "name": "BaseBdev2", 00:19:28.195 "uuid": "c1063e0c-f833-4365-8f69-4ce319a12052", 00:19:28.195 "is_configured": true, 00:19:28.195 "data_offset": 2048, 00:19:28.195 "data_size": 63488 00:19:28.195 }, 00:19:28.195 { 00:19:28.195 "name": "BaseBdev3", 00:19:28.195 "uuid": "af1ea89b-374a-4621-bb79-dc65428661b8", 00:19:28.195 "is_configured": true, 00:19:28.195 "data_offset": 2048, 00:19:28.196 "data_size": 63488 00:19:28.196 }, 00:19:28.196 { 00:19:28.196 "name": "BaseBdev4", 00:19:28.196 "uuid": "d9243707-a64f-4c88-8763-195e82af77cf", 00:19:28.196 "is_configured": true, 00:19:28.196 "data_offset": 2048, 00:19:28.196 "data_size": 63488 00:19:28.196 } 00:19:28.196 ] 00:19:28.196 }' 00:19:28.196 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:28.196 10:14:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:28.766 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:28.766 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:28.766 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:28.766 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:28.766 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:28.766 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:28.766 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:28.766 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:29.026 [2024-06-10 10:14:50.708229] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:29.026 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:29.027 "name": "Existed_Raid", 00:19:29.027 "aliases": [ 00:19:29.027 "1a3b09ee-577e-492e-ae07-2fe899326d74" 00:19:29.027 ], 00:19:29.027 "product_name": "Raid Volume", 00:19:29.027 "block_size": 512, 00:19:29.027 "num_blocks": 63488, 00:19:29.027 "uuid": "1a3b09ee-577e-492e-ae07-2fe899326d74", 00:19:29.027 "assigned_rate_limits": { 00:19:29.027 "rw_ios_per_sec": 0, 00:19:29.027 "rw_mbytes_per_sec": 0, 00:19:29.027 "r_mbytes_per_sec": 0, 00:19:29.027 "w_mbytes_per_sec": 0 
00:19:29.027 }, 00:19:29.027 "claimed": false, 00:19:29.027 "zoned": false, 00:19:29.027 "supported_io_types": { 00:19:29.027 "read": true, 00:19:29.027 "write": true, 00:19:29.027 "unmap": false, 00:19:29.027 "write_zeroes": true, 00:19:29.027 "flush": false, 00:19:29.027 "reset": true, 00:19:29.027 "compare": false, 00:19:29.027 "compare_and_write": false, 00:19:29.027 "abort": false, 00:19:29.027 "nvme_admin": false, 00:19:29.027 "nvme_io": false 00:19:29.027 }, 00:19:29.027 "memory_domains": [ 00:19:29.027 { 00:19:29.027 "dma_device_id": "system", 00:19:29.027 "dma_device_type": 1 00:19:29.027 }, 00:19:29.027 { 00:19:29.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.027 "dma_device_type": 2 00:19:29.027 }, 00:19:29.027 { 00:19:29.027 "dma_device_id": "system", 00:19:29.027 "dma_device_type": 1 00:19:29.027 }, 00:19:29.027 { 00:19:29.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.027 "dma_device_type": 2 00:19:29.027 }, 00:19:29.027 { 00:19:29.027 "dma_device_id": "system", 00:19:29.027 "dma_device_type": 1 00:19:29.027 }, 00:19:29.027 { 00:19:29.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.027 "dma_device_type": 2 00:19:29.027 }, 00:19:29.027 { 00:19:29.027 "dma_device_id": "system", 00:19:29.027 "dma_device_type": 1 00:19:29.027 }, 00:19:29.027 { 00:19:29.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.027 "dma_device_type": 2 00:19:29.027 } 00:19:29.027 ], 00:19:29.027 "driver_specific": { 00:19:29.027 "raid": { 00:19:29.027 "uuid": "1a3b09ee-577e-492e-ae07-2fe899326d74", 00:19:29.027 "strip_size_kb": 0, 00:19:29.027 "state": "online", 00:19:29.027 "raid_level": "raid1", 00:19:29.027 "superblock": true, 00:19:29.027 "num_base_bdevs": 4, 00:19:29.027 "num_base_bdevs_discovered": 4, 00:19:29.027 "num_base_bdevs_operational": 4, 00:19:29.027 "base_bdevs_list": [ 00:19:29.027 { 00:19:29.027 "name": "NewBaseBdev", 00:19:29.027 "uuid": "73eebecc-d5f2-46cd-8b2e-4435be6e4b18", 00:19:29.027 "is_configured": true, 00:19:29.027 "data_offset": 2048, 00:19:29.027 "data_size": 63488 00:19:29.027 }, 00:19:29.027 { 00:19:29.027 "name": "BaseBdev2", 00:19:29.027 "uuid": "c1063e0c-f833-4365-8f69-4ce319a12052", 00:19:29.027 "is_configured": true, 00:19:29.027 "data_offset": 2048, 00:19:29.027 "data_size": 63488 00:19:29.027 }, 00:19:29.027 { 00:19:29.027 "name": "BaseBdev3", 00:19:29.027 "uuid": "af1ea89b-374a-4621-bb79-dc65428661b8", 00:19:29.027 "is_configured": true, 00:19:29.027 "data_offset": 2048, 00:19:29.027 "data_size": 63488 00:19:29.027 }, 00:19:29.027 { 00:19:29.027 "name": "BaseBdev4", 00:19:29.027 "uuid": "d9243707-a64f-4c88-8763-195e82af77cf", 00:19:29.027 "is_configured": true, 00:19:29.027 "data_offset": 2048, 00:19:29.027 "data_size": 63488 00:19:29.027 } 00:19:29.027 ] 00:19:29.027 } 00:19:29.027 } 00:19:29.027 }' 00:19:29.027 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:29.027 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:29.027 BaseBdev2 00:19:29.027 BaseBdev3 00:19:29.027 BaseBdev4' 00:19:29.027 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:29.027 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:29.027 10:14:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:29.287 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:29.287 "name": "NewBaseBdev", 00:19:29.287 "aliases": [ 00:19:29.287 "73eebecc-d5f2-46cd-8b2e-4435be6e4b18" 00:19:29.287 ], 00:19:29.287 "product_name": "Malloc disk", 00:19:29.287 "block_size": 512, 00:19:29.287 "num_blocks": 65536, 00:19:29.287 "uuid": "73eebecc-d5f2-46cd-8b2e-4435be6e4b18", 00:19:29.287 "assigned_rate_limits": { 00:19:29.287 "rw_ios_per_sec": 0, 00:19:29.287 "rw_mbytes_per_sec": 0, 00:19:29.287 "r_mbytes_per_sec": 0, 00:19:29.287 "w_mbytes_per_sec": 0 00:19:29.287 }, 00:19:29.287 "claimed": true, 00:19:29.287 "claim_type": "exclusive_write", 00:19:29.287 "zoned": false, 00:19:29.287 "supported_io_types": { 00:19:29.287 "read": true, 00:19:29.287 "write": true, 00:19:29.287 "unmap": true, 00:19:29.287 "write_zeroes": true, 00:19:29.287 "flush": true, 00:19:29.287 "reset": true, 00:19:29.287 "compare": false, 00:19:29.287 "compare_and_write": false, 00:19:29.287 "abort": true, 00:19:29.287 "nvme_admin": false, 00:19:29.287 "nvme_io": false 00:19:29.287 }, 00:19:29.287 "memory_domains": [ 00:19:29.287 { 00:19:29.287 "dma_device_id": "system", 00:19:29.287 "dma_device_type": 1 00:19:29.287 }, 00:19:29.287 { 00:19:29.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.287 "dma_device_type": 2 00:19:29.287 } 00:19:29.287 ], 00:19:29.287 "driver_specific": {} 00:19:29.287 }' 00:19:29.287 10:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.287 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.287 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:29.287 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.287 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.287 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:29.287 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.547 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.547 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:29.547 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.547 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.547 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:29.547 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:29.547 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:29.547 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:29.807 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:29.807 "name": "BaseBdev2", 00:19:29.807 "aliases": [ 00:19:29.807 "c1063e0c-f833-4365-8f69-4ce319a12052" 00:19:29.807 ], 00:19:29.807 "product_name": "Malloc disk", 00:19:29.807 "block_size": 512, 00:19:29.807 "num_blocks": 65536, 
00:19:29.807 "uuid": "c1063e0c-f833-4365-8f69-4ce319a12052", 00:19:29.807 "assigned_rate_limits": { 00:19:29.807 "rw_ios_per_sec": 0, 00:19:29.807 "rw_mbytes_per_sec": 0, 00:19:29.807 "r_mbytes_per_sec": 0, 00:19:29.807 "w_mbytes_per_sec": 0 00:19:29.807 }, 00:19:29.807 "claimed": true, 00:19:29.807 "claim_type": "exclusive_write", 00:19:29.807 "zoned": false, 00:19:29.807 "supported_io_types": { 00:19:29.807 "read": true, 00:19:29.807 "write": true, 00:19:29.807 "unmap": true, 00:19:29.807 "write_zeroes": true, 00:19:29.807 "flush": true, 00:19:29.807 "reset": true, 00:19:29.807 "compare": false, 00:19:29.807 "compare_and_write": false, 00:19:29.807 "abort": true, 00:19:29.807 "nvme_admin": false, 00:19:29.807 "nvme_io": false 00:19:29.807 }, 00:19:29.807 "memory_domains": [ 00:19:29.807 { 00:19:29.807 "dma_device_id": "system", 00:19:29.807 "dma_device_type": 1 00:19:29.807 }, 00:19:29.807 { 00:19:29.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.807 "dma_device_type": 2 00:19:29.807 } 00:19:29.807 ], 00:19:29.807 "driver_specific": {} 00:19:29.807 }' 00:19:29.807 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.807 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.807 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:29.807 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.808 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.808 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:29.808 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.068 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.068 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:30.068 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.068 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.068 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:30.068 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:30.068 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:30.068 10:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:30.327 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:30.327 "name": "BaseBdev3", 00:19:30.327 "aliases": [ 00:19:30.327 "af1ea89b-374a-4621-bb79-dc65428661b8" 00:19:30.327 ], 00:19:30.327 "product_name": "Malloc disk", 00:19:30.327 "block_size": 512, 00:19:30.327 "num_blocks": 65536, 00:19:30.327 "uuid": "af1ea89b-374a-4621-bb79-dc65428661b8", 00:19:30.327 "assigned_rate_limits": { 00:19:30.328 "rw_ios_per_sec": 0, 00:19:30.328 "rw_mbytes_per_sec": 0, 00:19:30.328 "r_mbytes_per_sec": 0, 00:19:30.328 "w_mbytes_per_sec": 0 00:19:30.328 }, 00:19:30.328 "claimed": true, 00:19:30.328 "claim_type": "exclusive_write", 00:19:30.328 "zoned": false, 00:19:30.328 "supported_io_types": { 00:19:30.328 "read": true, 
00:19:30.328 "write": true, 00:19:30.328 "unmap": true, 00:19:30.328 "write_zeroes": true, 00:19:30.328 "flush": true, 00:19:30.328 "reset": true, 00:19:30.328 "compare": false, 00:19:30.328 "compare_and_write": false, 00:19:30.328 "abort": true, 00:19:30.328 "nvme_admin": false, 00:19:30.328 "nvme_io": false 00:19:30.328 }, 00:19:30.328 "memory_domains": [ 00:19:30.328 { 00:19:30.328 "dma_device_id": "system", 00:19:30.328 "dma_device_type": 1 00:19:30.328 }, 00:19:30.328 { 00:19:30.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.328 "dma_device_type": 2 00:19:30.328 } 00:19:30.328 ], 00:19:30.328 "driver_specific": {} 00:19:30.328 }' 00:19:30.328 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.328 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.328 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:30.328 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.328 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.328 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:30.328 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.587 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.587 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:30.587 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.587 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.587 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:30.587 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:30.587 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:30.587 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:30.847 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:30.847 "name": "BaseBdev4", 00:19:30.847 "aliases": [ 00:19:30.847 "d9243707-a64f-4c88-8763-195e82af77cf" 00:19:30.847 ], 00:19:30.847 "product_name": "Malloc disk", 00:19:30.847 "block_size": 512, 00:19:30.847 "num_blocks": 65536, 00:19:30.847 "uuid": "d9243707-a64f-4c88-8763-195e82af77cf", 00:19:30.847 "assigned_rate_limits": { 00:19:30.847 "rw_ios_per_sec": 0, 00:19:30.847 "rw_mbytes_per_sec": 0, 00:19:30.847 "r_mbytes_per_sec": 0, 00:19:30.847 "w_mbytes_per_sec": 0 00:19:30.847 }, 00:19:30.847 "claimed": true, 00:19:30.847 "claim_type": "exclusive_write", 00:19:30.847 "zoned": false, 00:19:30.847 "supported_io_types": { 00:19:30.847 "read": true, 00:19:30.847 "write": true, 00:19:30.847 "unmap": true, 00:19:30.847 "write_zeroes": true, 00:19:30.847 "flush": true, 00:19:30.847 "reset": true, 00:19:30.847 "compare": false, 00:19:30.847 "compare_and_write": false, 00:19:30.847 "abort": true, 00:19:30.847 "nvme_admin": false, 00:19:30.847 "nvme_io": false 00:19:30.847 }, 00:19:30.847 "memory_domains": [ 00:19:30.847 { 00:19:30.847 "dma_device_id": "system", 00:19:30.847 
"dma_device_type": 1 00:19:30.847 }, 00:19:30.847 { 00:19:30.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.847 "dma_device_type": 2 00:19:30.847 } 00:19:30.847 ], 00:19:30.847 "driver_specific": {} 00:19:30.847 }' 00:19:30.847 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.847 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.847 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:30.847 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.847 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.847 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:31.107 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.107 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.107 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:31.107 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.107 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.107 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:31.107 10:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:31.368 [2024-06-10 10:14:53.037895] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:31.368 [2024-06-10 10:14:53.037916] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:31.368 [2024-06-10 10:14:53.037956] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:31.368 [2024-06-10 10:14:53.038166] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:31.368 [2024-06-10 10:14:53.038173] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25570f0 name Existed_Raid, state offline 00:19:31.368 10:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1057447 00:19:31.368 10:14:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1057447 ']' 00:19:31.368 10:14:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1057447 00:19:31.368 10:14:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:19:31.368 10:14:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:31.368 10:14:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1057447 00:19:31.368 10:14:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:31.368 10:14:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:31.368 10:14:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1057447' 00:19:31.368 killing process with pid 1057447 00:19:31.368 10:14:53 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1057447 00:19:31.368 [2024-06-10 10:14:53.106406] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:31.368 10:14:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1057447 00:19:31.368 [2024-06-10 10:14:53.126785] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:31.628 10:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:31.628 00:19:31.628 real 0m26.891s 00:19:31.628 user 0m50.343s 00:19:31.628 sys 0m4.041s 00:19:31.628 10:14:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:31.628 10:14:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:31.628 ************************************ 00:19:31.628 END TEST raid_state_function_test_sb 00:19:31.628 ************************************ 00:19:31.628 10:14:53 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:19:31.628 10:14:53 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:19:31.628 10:14:53 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:31.628 10:14:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:31.628 ************************************ 00:19:31.628 START TEST raid_superblock_test 00:19:31.628 ************************************ 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 4 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1062651 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1062651 /var/tmp/spdk-raid.sock 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1062651 ']' 00:19:31.628 10:14:53 
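Before any raid_superblock_test RPC can run, the test brings up its own SPDK application: bdev_svc is launched (the command is traced just below) with a private RPC socket and bdev_raid debug logging, its pid is saved as raid_pid, and waitforlisten blocks until that socket answers. A minimal sketch of the same startup with the paths from this run; waitforlisten is the helper from autotest_common.sh, and backgrounding with & / $! is an approximation, since the trace only shows the resulting raid_pid:

  /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
      -r /var/tmp/spdk-raid.sock -L bdev_raid &
  raid_pid=$!                                          # 1062651 in this run
  waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock    # wait until the RPC socket is up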
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:31.628 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:31.628 10:14:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:31.628 [2024-06-10 10:14:53.378594] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:19:31.628 [2024-06-10 10:14:53.378641] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1062651 ] 00:19:31.628 [2024-06-10 10:14:53.466233] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:31.888 [2024-06-10 10:14:53.530077] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:31.888 [2024-06-10 10:14:53.571914] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:31.888 [2024-06-10 10:14:53.571937] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:32.458 10:14:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:32.458 10:14:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:19:32.458 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:19:32.458 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:32.458 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:19:32.458 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:19:32.458 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:32.458 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:32.458 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:32.458 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:32.458 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:32.718 malloc1 00:19:32.718 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:32.718 [2024-06-10 10:14:54.581982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:32.718 [2024-06-10 10:14:54.582016] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:19:32.718 [2024-06-10 10:14:54.582028] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a3c990 00:19:32.718 [2024-06-10 10:14:54.582039] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:32.718 [2024-06-10 10:14:54.583351] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:32.718 [2024-06-10 10:14:54.583371] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:32.978 pt1 00:19:32.978 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:32.978 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:32.978 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:19:32.978 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:19:32.978 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:32.978 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:32.978 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:32.978 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:32.978 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:32.978 malloc2 00:19:32.978 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:33.239 [2024-06-10 10:14:54.965008] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:33.239 [2024-06-10 10:14:54.965037] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:33.239 [2024-06-10 10:14:54.965047] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a3d4e0 00:19:33.239 [2024-06-10 10:14:54.965053] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:33.239 [2024-06-10 10:14:54.966246] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:33.239 [2024-06-10 10:14:54.966264] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:33.239 pt2 00:19:33.239 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:33.239 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:33.239 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:19:33.239 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:19:33.239 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:33.239 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:33.239 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:33.239 10:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:33.239 10:14:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:33.498 malloc3 00:19:33.498 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:33.498 [2024-06-10 10:14:55.331710] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:33.498 [2024-06-10 10:14:55.331739] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:33.498 [2024-06-10 10:14:55.331748] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1be94e0 00:19:33.498 [2024-06-10 10:14:55.331754] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:33.498 [2024-06-10 10:14:55.332935] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:33.498 [2024-06-10 10:14:55.332953] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:33.498 pt3 00:19:33.498 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:33.498 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:33.499 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:19:33.499 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:19:33.499 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:33.499 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:33.499 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:33.499 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:33.499 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:33.758 malloc4 00:19:33.759 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:34.019 [2024-06-10 10:14:55.718556] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:34.019 [2024-06-10 10:14:55.718583] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:34.019 [2024-06-10 10:14:55.718592] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bebf50 00:19:34.019 [2024-06-10 10:14:55.718598] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:34.019 [2024-06-10 10:14:55.719769] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:34.019 [2024-06-10 10:14:55.719786] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:34.019 pt4 00:19:34.019 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:34.019 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:34.019 10:14:55 
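At this point all four data bdevs exist. The loop traced above created, for i in 1..4, a 32 MiB malloc bdev with 512-byte blocks (65536 blocks in the dumps above) and wrapped it in a passthru bdev with a fixed uuid; the bdev_raid_create call traced just below then combines the four passthru bdevs into a raid1 volume with a superblock (-s). Condensed into one runnable sketch, with the paths, sizes and uuids taken from this run:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  for i in 1 2 3 4; do
      # 32 MiB backing store with 512-byte blocks
      $rpc -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc$i
      # passthru layer with a deterministic uuid, referenced by the test's later checks
      $rpc -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc$i -p pt$i \
          -u 00000000-0000-0000-0000-00000000000$i
  done
  # raid1 across the four passthru bdevs, created with an on-disk superblock (-s)
  $rpc -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s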
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:34.279 [2024-06-10 10:14:55.907047] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:34.279 [2024-06-10 10:14:55.908045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:34.279 [2024-06-10 10:14:55.908087] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:34.279 [2024-06-10 10:14:55.908119] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:34.279 [2024-06-10 10:14:55.908256] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bedfb0 00:19:34.279 [2024-06-10 10:14:55.908263] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:34.279 [2024-06-10 10:14:55.908411] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a3ddf0 00:19:34.279 [2024-06-10 10:14:55.908525] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bedfb0 00:19:34.279 [2024-06-10 10:14:55.908530] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bedfb0 00:19:34.279 [2024-06-10 10:14:55.908599] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:34.279 10:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.279 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:34.279 "name": "raid_bdev1", 00:19:34.279 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:34.279 "strip_size_kb": 0, 00:19:34.279 "state": "online", 00:19:34.279 "raid_level": "raid1", 00:19:34.279 "superblock": true, 00:19:34.279 "num_base_bdevs": 4, 00:19:34.279 "num_base_bdevs_discovered": 4, 00:19:34.279 "num_base_bdevs_operational": 4, 00:19:34.279 "base_bdevs_list": [ 00:19:34.279 { 00:19:34.279 "name": "pt1", 00:19:34.279 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:34.279 "is_configured": true, 00:19:34.279 "data_offset": 2048, 00:19:34.279 
"data_size": 63488 00:19:34.279 }, 00:19:34.279 { 00:19:34.279 "name": "pt2", 00:19:34.279 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:34.279 "is_configured": true, 00:19:34.279 "data_offset": 2048, 00:19:34.279 "data_size": 63488 00:19:34.279 }, 00:19:34.279 { 00:19:34.279 "name": "pt3", 00:19:34.279 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:34.279 "is_configured": true, 00:19:34.279 "data_offset": 2048, 00:19:34.279 "data_size": 63488 00:19:34.279 }, 00:19:34.279 { 00:19:34.279 "name": "pt4", 00:19:34.279 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:34.279 "is_configured": true, 00:19:34.279 "data_offset": 2048, 00:19:34.279 "data_size": 63488 00:19:34.279 } 00:19:34.279 ] 00:19:34.279 }' 00:19:34.279 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:34.279 10:14:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:34.849 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:19:34.849 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:34.849 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:34.849 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:34.849 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:34.849 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:34.849 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:34.849 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:35.110 [2024-06-10 10:14:56.809526] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:35.110 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:35.110 "name": "raid_bdev1", 00:19:35.110 "aliases": [ 00:19:35.110 "84bf5243-56ff-4c1a-a053-fe75f8fc8f52" 00:19:35.110 ], 00:19:35.110 "product_name": "Raid Volume", 00:19:35.110 "block_size": 512, 00:19:35.110 "num_blocks": 63488, 00:19:35.110 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:35.110 "assigned_rate_limits": { 00:19:35.110 "rw_ios_per_sec": 0, 00:19:35.110 "rw_mbytes_per_sec": 0, 00:19:35.110 "r_mbytes_per_sec": 0, 00:19:35.110 "w_mbytes_per_sec": 0 00:19:35.110 }, 00:19:35.110 "claimed": false, 00:19:35.110 "zoned": false, 00:19:35.110 "supported_io_types": { 00:19:35.110 "read": true, 00:19:35.110 "write": true, 00:19:35.110 "unmap": false, 00:19:35.110 "write_zeroes": true, 00:19:35.110 "flush": false, 00:19:35.110 "reset": true, 00:19:35.110 "compare": false, 00:19:35.110 "compare_and_write": false, 00:19:35.110 "abort": false, 00:19:35.110 "nvme_admin": false, 00:19:35.110 "nvme_io": false 00:19:35.110 }, 00:19:35.110 "memory_domains": [ 00:19:35.110 { 00:19:35.110 "dma_device_id": "system", 00:19:35.110 "dma_device_type": 1 00:19:35.110 }, 00:19:35.110 { 00:19:35.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.110 "dma_device_type": 2 00:19:35.110 }, 00:19:35.110 { 00:19:35.110 "dma_device_id": "system", 00:19:35.110 "dma_device_type": 1 00:19:35.110 }, 00:19:35.110 { 00:19:35.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.110 "dma_device_type": 2 00:19:35.110 }, 00:19:35.110 { 00:19:35.110 
"dma_device_id": "system", 00:19:35.110 "dma_device_type": 1 00:19:35.110 }, 00:19:35.110 { 00:19:35.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.110 "dma_device_type": 2 00:19:35.110 }, 00:19:35.110 { 00:19:35.110 "dma_device_id": "system", 00:19:35.110 "dma_device_type": 1 00:19:35.110 }, 00:19:35.110 { 00:19:35.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.110 "dma_device_type": 2 00:19:35.110 } 00:19:35.110 ], 00:19:35.110 "driver_specific": { 00:19:35.110 "raid": { 00:19:35.110 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:35.110 "strip_size_kb": 0, 00:19:35.110 "state": "online", 00:19:35.110 "raid_level": "raid1", 00:19:35.110 "superblock": true, 00:19:35.110 "num_base_bdevs": 4, 00:19:35.110 "num_base_bdevs_discovered": 4, 00:19:35.110 "num_base_bdevs_operational": 4, 00:19:35.110 "base_bdevs_list": [ 00:19:35.110 { 00:19:35.110 "name": "pt1", 00:19:35.110 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:35.110 "is_configured": true, 00:19:35.110 "data_offset": 2048, 00:19:35.110 "data_size": 63488 00:19:35.110 }, 00:19:35.110 { 00:19:35.110 "name": "pt2", 00:19:35.110 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:35.110 "is_configured": true, 00:19:35.110 "data_offset": 2048, 00:19:35.110 "data_size": 63488 00:19:35.110 }, 00:19:35.110 { 00:19:35.110 "name": "pt3", 00:19:35.110 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:35.110 "is_configured": true, 00:19:35.110 "data_offset": 2048, 00:19:35.110 "data_size": 63488 00:19:35.110 }, 00:19:35.110 { 00:19:35.110 "name": "pt4", 00:19:35.110 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:35.110 "is_configured": true, 00:19:35.110 "data_offset": 2048, 00:19:35.110 "data_size": 63488 00:19:35.110 } 00:19:35.110 ] 00:19:35.110 } 00:19:35.110 } 00:19:35.110 }' 00:19:35.110 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:35.110 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:35.110 pt2 00:19:35.110 pt3 00:19:35.110 pt4' 00:19:35.110 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:35.110 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:35.110 10:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:35.370 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:35.370 "name": "pt1", 00:19:35.370 "aliases": [ 00:19:35.370 "00000000-0000-0000-0000-000000000001" 00:19:35.370 ], 00:19:35.370 "product_name": "passthru", 00:19:35.370 "block_size": 512, 00:19:35.370 "num_blocks": 65536, 00:19:35.370 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:35.370 "assigned_rate_limits": { 00:19:35.370 "rw_ios_per_sec": 0, 00:19:35.370 "rw_mbytes_per_sec": 0, 00:19:35.370 "r_mbytes_per_sec": 0, 00:19:35.370 "w_mbytes_per_sec": 0 00:19:35.370 }, 00:19:35.370 "claimed": true, 00:19:35.370 "claim_type": "exclusive_write", 00:19:35.370 "zoned": false, 00:19:35.370 "supported_io_types": { 00:19:35.370 "read": true, 00:19:35.370 "write": true, 00:19:35.370 "unmap": true, 00:19:35.370 "write_zeroes": true, 00:19:35.370 "flush": true, 00:19:35.370 "reset": true, 00:19:35.370 "compare": false, 00:19:35.370 "compare_and_write": false, 00:19:35.370 "abort": true, 00:19:35.370 "nvme_admin": 
false, 00:19:35.370 "nvme_io": false 00:19:35.370 }, 00:19:35.370 "memory_domains": [ 00:19:35.370 { 00:19:35.370 "dma_device_id": "system", 00:19:35.370 "dma_device_type": 1 00:19:35.370 }, 00:19:35.370 { 00:19:35.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.370 "dma_device_type": 2 00:19:35.370 } 00:19:35.370 ], 00:19:35.370 "driver_specific": { 00:19:35.370 "passthru": { 00:19:35.370 "name": "pt1", 00:19:35.370 "base_bdev_name": "malloc1" 00:19:35.370 } 00:19:35.370 } 00:19:35.370 }' 00:19:35.370 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:35.370 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:35.370 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:35.371 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.371 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.631 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:35.631 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.631 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.631 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:35.631 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.631 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.631 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:35.631 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:35.631 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:35.631 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:35.891 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:35.891 "name": "pt2", 00:19:35.891 "aliases": [ 00:19:35.891 "00000000-0000-0000-0000-000000000002" 00:19:35.891 ], 00:19:35.891 "product_name": "passthru", 00:19:35.891 "block_size": 512, 00:19:35.891 "num_blocks": 65536, 00:19:35.891 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:35.891 "assigned_rate_limits": { 00:19:35.891 "rw_ios_per_sec": 0, 00:19:35.891 "rw_mbytes_per_sec": 0, 00:19:35.891 "r_mbytes_per_sec": 0, 00:19:35.891 "w_mbytes_per_sec": 0 00:19:35.891 }, 00:19:35.891 "claimed": true, 00:19:35.891 "claim_type": "exclusive_write", 00:19:35.891 "zoned": false, 00:19:35.891 "supported_io_types": { 00:19:35.891 "read": true, 00:19:35.891 "write": true, 00:19:35.891 "unmap": true, 00:19:35.891 "write_zeroes": true, 00:19:35.891 "flush": true, 00:19:35.891 "reset": true, 00:19:35.891 "compare": false, 00:19:35.891 "compare_and_write": false, 00:19:35.891 "abort": true, 00:19:35.891 "nvme_admin": false, 00:19:35.891 "nvme_io": false 00:19:35.891 }, 00:19:35.891 "memory_domains": [ 00:19:35.891 { 00:19:35.891 "dma_device_id": "system", 00:19:35.891 "dma_device_type": 1 00:19:35.891 }, 00:19:35.891 { 00:19:35.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.891 "dma_device_type": 2 00:19:35.891 } 00:19:35.891 ], 00:19:35.891 "driver_specific": { 00:19:35.891 "passthru": { 00:19:35.891 "name": "pt2", 00:19:35.891 
"base_bdev_name": "malloc2" 00:19:35.891 } 00:19:35.891 } 00:19:35.891 }' 00:19:35.891 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:35.891 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:35.891 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:35.891 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.891 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.151 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:36.151 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.151 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.151 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:36.151 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.151 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.152 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:36.152 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:36.152 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:36.152 10:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:36.411 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:36.411 "name": "pt3", 00:19:36.411 "aliases": [ 00:19:36.412 "00000000-0000-0000-0000-000000000003" 00:19:36.412 ], 00:19:36.412 "product_name": "passthru", 00:19:36.412 "block_size": 512, 00:19:36.412 "num_blocks": 65536, 00:19:36.412 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:36.412 "assigned_rate_limits": { 00:19:36.412 "rw_ios_per_sec": 0, 00:19:36.412 "rw_mbytes_per_sec": 0, 00:19:36.412 "r_mbytes_per_sec": 0, 00:19:36.412 "w_mbytes_per_sec": 0 00:19:36.412 }, 00:19:36.412 "claimed": true, 00:19:36.412 "claim_type": "exclusive_write", 00:19:36.412 "zoned": false, 00:19:36.412 "supported_io_types": { 00:19:36.412 "read": true, 00:19:36.412 "write": true, 00:19:36.412 "unmap": true, 00:19:36.412 "write_zeroes": true, 00:19:36.412 "flush": true, 00:19:36.412 "reset": true, 00:19:36.412 "compare": false, 00:19:36.412 "compare_and_write": false, 00:19:36.412 "abort": true, 00:19:36.412 "nvme_admin": false, 00:19:36.412 "nvme_io": false 00:19:36.412 }, 00:19:36.412 "memory_domains": [ 00:19:36.412 { 00:19:36.412 "dma_device_id": "system", 00:19:36.412 "dma_device_type": 1 00:19:36.412 }, 00:19:36.412 { 00:19:36.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.412 "dma_device_type": 2 00:19:36.412 } 00:19:36.412 ], 00:19:36.412 "driver_specific": { 00:19:36.412 "passthru": { 00:19:36.412 "name": "pt3", 00:19:36.412 "base_bdev_name": "malloc3" 00:19:36.412 } 00:19:36.412 } 00:19:36.412 }' 00:19:36.412 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.412 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.412 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:36.412 10:14:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.412 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.671 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:36.671 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.671 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.671 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:36.671 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.671 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.671 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:36.671 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:36.671 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:36.671 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:36.931 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:36.931 "name": "pt4", 00:19:36.931 "aliases": [ 00:19:36.931 "00000000-0000-0000-0000-000000000004" 00:19:36.931 ], 00:19:36.931 "product_name": "passthru", 00:19:36.931 "block_size": 512, 00:19:36.931 "num_blocks": 65536, 00:19:36.931 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:36.931 "assigned_rate_limits": { 00:19:36.931 "rw_ios_per_sec": 0, 00:19:36.931 "rw_mbytes_per_sec": 0, 00:19:36.931 "r_mbytes_per_sec": 0, 00:19:36.931 "w_mbytes_per_sec": 0 00:19:36.931 }, 00:19:36.931 "claimed": true, 00:19:36.931 "claim_type": "exclusive_write", 00:19:36.931 "zoned": false, 00:19:36.931 "supported_io_types": { 00:19:36.931 "read": true, 00:19:36.931 "write": true, 00:19:36.931 "unmap": true, 00:19:36.931 "write_zeroes": true, 00:19:36.931 "flush": true, 00:19:36.931 "reset": true, 00:19:36.931 "compare": false, 00:19:36.931 "compare_and_write": false, 00:19:36.931 "abort": true, 00:19:36.931 "nvme_admin": false, 00:19:36.931 "nvme_io": false 00:19:36.931 }, 00:19:36.931 "memory_domains": [ 00:19:36.931 { 00:19:36.931 "dma_device_id": "system", 00:19:36.931 "dma_device_type": 1 00:19:36.931 }, 00:19:36.931 { 00:19:36.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.931 "dma_device_type": 2 00:19:36.931 } 00:19:36.931 ], 00:19:36.931 "driver_specific": { 00:19:36.931 "passthru": { 00:19:36.931 "name": "pt4", 00:19:36.931 "base_bdev_name": "malloc4" 00:19:36.931 } 00:19:36.931 } 00:19:36.931 }' 00:19:36.931 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.931 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.931 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:36.931 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.931 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:37.191 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:37.191 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.191 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:19:37.191 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:37.191 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.191 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.191 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:37.191 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:19:37.191 10:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:37.450 [2024-06-10 10:14:59.159513] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:37.450 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=84bf5243-56ff-4c1a-a053-fe75f8fc8f52 00:19:37.450 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 84bf5243-56ff-4c1a-a053-fe75f8fc8f52 ']' 00:19:37.450 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:37.709 [2024-06-10 10:14:59.335736] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:37.709 [2024-06-10 10:14:59.335750] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:37.709 [2024-06-10 10:14:59.335786] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:37.709 [2024-06-10 10:14:59.335856] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:37.709 [2024-06-10 10:14:59.335863] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bedfb0 name raid_bdev1, state offline 00:19:37.709 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.709 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:37.709 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:37.710 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:37.710 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:37.710 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:37.969 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:37.969 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:38.228 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:38.228 10:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:38.487 10:15:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:38.487 10:15:00 
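Teardown mirrors the setup: the raid bdev's uuid is read back and checked, raid_bdev1 is deleted, each passthru bdev is deleted in turn (the loop is in progress here and finishes with pt4 just below), and a final bdev_get_bdevs confirms that no passthru bdev survived. The same sequence as a minimal sketch, with the socket and names from this run:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  raid_bdev_uuid=$($rpc -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid')
  [[ -n $raid_bdev_uuid ]]                              # the volume must still exist here
  $rpc -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
  for i in 1 2 3 4; do
      $rpc -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt$i
  done
  # no passthru bdev should be left after the deletes
  [[ $($rpc -s /var/tmp/spdk-raid.sock bdev_get_bdevs \
        | jq -r '[.[] | select(.product_name == "passthru")] | any') == false ]]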
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:38.487 10:15:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:38.487 10:15:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:38.747 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:39.007 [2024-06-10 10:15:00.675071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:39.007 [2024-06-10 10:15:00.676140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:39.007 [2024-06-10 10:15:00.676173] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:39.007 [2024-06-10 10:15:00.676200] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:39.007 [2024-06-10 10:15:00.676234] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:39.007 [2024-06-10 10:15:00.676261] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:39.007 [2024-06-10 10:15:00.676275] 
bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:39.007 [2024-06-10 10:15:00.676289] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:39.007 [2024-06-10 10:15:00.676299] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:39.007 [2024-06-10 10:15:00.676304] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a3ba10 name raid_bdev1, state configuring 00:19:39.007 request: 00:19:39.007 { 00:19:39.007 "name": "raid_bdev1", 00:19:39.007 "raid_level": "raid1", 00:19:39.007 "base_bdevs": [ 00:19:39.007 "malloc1", 00:19:39.007 "malloc2", 00:19:39.007 "malloc3", 00:19:39.007 "malloc4" 00:19:39.007 ], 00:19:39.007 "superblock": false, 00:19:39.007 "method": "bdev_raid_create", 00:19:39.007 "req_id": 1 00:19:39.007 } 00:19:39.007 Got JSON-RPC error response 00:19:39.007 response: 00:19:39.007 { 00:19:39.007 "code": -17, 00:19:39.007 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:39.007 } 00:19:39.007 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:19:39.007 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:19:39.007 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:19:39.007 10:15:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:19:39.007 10:15:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.007 10:15:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:39.266 10:15:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:39.266 10:15:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:39.266 10:15:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:39.266 [2024-06-10 10:15:01.056000] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:39.266 [2024-06-10 10:15:01.056033] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:39.266 [2024-06-10 10:15:01.056044] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bee250 00:19:39.266 [2024-06-10 10:15:01.056050] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:39.266 [2024-06-10 10:15:01.057319] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:39.266 [2024-06-10 10:15:01.057338] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:39.266 [2024-06-10 10:15:01.057385] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:39.266 [2024-06-10 10:15:01.057402] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:39.266 pt1 00:19:39.266 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:19:39.266 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:39.266 10:15:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:39.266 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:39.266 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:39.266 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:39.266 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.266 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.266 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.266 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.266 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.266 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:39.526 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.526 "name": "raid_bdev1", 00:19:39.526 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:39.526 "strip_size_kb": 0, 00:19:39.526 "state": "configuring", 00:19:39.526 "raid_level": "raid1", 00:19:39.526 "superblock": true, 00:19:39.526 "num_base_bdevs": 4, 00:19:39.526 "num_base_bdevs_discovered": 1, 00:19:39.526 "num_base_bdevs_operational": 4, 00:19:39.526 "base_bdevs_list": [ 00:19:39.526 { 00:19:39.526 "name": "pt1", 00:19:39.526 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:39.526 "is_configured": true, 00:19:39.526 "data_offset": 2048, 00:19:39.526 "data_size": 63488 00:19:39.526 }, 00:19:39.526 { 00:19:39.526 "name": null, 00:19:39.526 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:39.526 "is_configured": false, 00:19:39.526 "data_offset": 2048, 00:19:39.526 "data_size": 63488 00:19:39.526 }, 00:19:39.526 { 00:19:39.526 "name": null, 00:19:39.526 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:39.526 "is_configured": false, 00:19:39.526 "data_offset": 2048, 00:19:39.526 "data_size": 63488 00:19:39.526 }, 00:19:39.526 { 00:19:39.526 "name": null, 00:19:39.526 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:39.526 "is_configured": false, 00:19:39.526 "data_offset": 2048, 00:19:39.526 "data_size": 63488 00:19:39.526 } 00:19:39.526 ] 00:19:39.526 }' 00:19:39.526 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.526 10:15:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:40.095 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:19:40.095 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:40.095 [2024-06-10 10:15:01.934231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:40.095 [2024-06-10 10:15:01.934269] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:40.095 [2024-06-10 10:15:01.934282] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1be8670 00:19:40.095 [2024-06-10 10:15:01.934289] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:40.095 [2024-06-10 10:15:01.934558] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:40.095 [2024-06-10 10:15:01.934569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:40.095 [2024-06-10 10:15:01.934615] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:40.095 [2024-06-10 10:15:01.934627] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:40.095 pt2 00:19:40.095 10:15:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:40.354 [2024-06-10 10:15:02.126717] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.354 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:40.614 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:40.614 "name": "raid_bdev1", 00:19:40.614 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:40.614 "strip_size_kb": 0, 00:19:40.614 "state": "configuring", 00:19:40.614 "raid_level": "raid1", 00:19:40.614 "superblock": true, 00:19:40.614 "num_base_bdevs": 4, 00:19:40.614 "num_base_bdevs_discovered": 1, 00:19:40.614 "num_base_bdevs_operational": 4, 00:19:40.614 "base_bdevs_list": [ 00:19:40.614 { 00:19:40.614 "name": "pt1", 00:19:40.614 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:40.614 "is_configured": true, 00:19:40.614 "data_offset": 2048, 00:19:40.614 "data_size": 63488 00:19:40.614 }, 00:19:40.614 { 00:19:40.614 "name": null, 00:19:40.614 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:40.614 "is_configured": false, 00:19:40.614 "data_offset": 2048, 00:19:40.614 "data_size": 63488 00:19:40.614 }, 00:19:40.614 { 00:19:40.614 "name": null, 00:19:40.614 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:40.614 "is_configured": false, 00:19:40.614 "data_offset": 2048, 00:19:40.614 "data_size": 63488 00:19:40.614 }, 00:19:40.614 { 00:19:40.614 "name": null, 00:19:40.614 "uuid": "00000000-0000-0000-0000-000000000004", 
00:19:40.614 "is_configured": false, 00:19:40.614 "data_offset": 2048, 00:19:40.614 "data_size": 63488 00:19:40.614 } 00:19:40.614 ] 00:19:40.614 }' 00:19:40.614 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:40.614 10:15:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:41.183 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:19:41.183 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:41.183 10:15:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:41.442 [2024-06-10 10:15:03.057086] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:41.442 [2024-06-10 10:15:03.057122] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.442 [2024-06-10 10:15:03.057134] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bedd10 00:19:41.442 [2024-06-10 10:15:03.057141] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.442 [2024-06-10 10:15:03.057415] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.442 [2024-06-10 10:15:03.057425] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:41.442 [2024-06-10 10:15:03.057469] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:41.442 [2024-06-10 10:15:03.057481] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:41.442 pt2 00:19:41.442 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:41.442 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:41.442 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:41.442 [2024-06-10 10:15:03.233526] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:41.442 [2024-06-10 10:15:03.233549] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.442 [2024-06-10 10:15:03.233558] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a3bff0 00:19:41.442 [2024-06-10 10:15:03.233563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.442 [2024-06-10 10:15:03.233781] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.442 [2024-06-10 10:15:03.233791] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:41.442 [2024-06-10 10:15:03.233828] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:41.442 [2024-06-10 10:15:03.233838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:41.442 pt3 00:19:41.442 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:41.442 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:41.442 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:41.702 [2024-06-10 10:15:03.422011] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:41.702 [2024-06-10 10:15:03.422032] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.702 [2024-06-10 10:15:03.422041] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a3b1c0 00:19:41.702 [2024-06-10 10:15:03.422047] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.702 [2024-06-10 10:15:03.422262] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.702 [2024-06-10 10:15:03.422272] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:41.702 [2024-06-10 10:15:03.422304] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:41.702 [2024-06-10 10:15:03.422314] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:41.702 [2024-06-10 10:15:03.422405] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bec7d0 00:19:41.702 [2024-06-10 10:15:03.422411] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:41.702 [2024-06-10 10:15:03.422541] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bdd900 00:19:41.702 [2024-06-10 10:15:03.422646] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bec7d0 00:19:41.702 [2024-06-10 10:15:03.422651] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bec7d0 00:19:41.702 [2024-06-10 10:15:03.422724] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:41.702 pt4 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.702 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:41.962 10:15:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:41.962 "name": "raid_bdev1", 00:19:41.962 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:41.962 "strip_size_kb": 0, 00:19:41.962 "state": "online", 00:19:41.962 "raid_level": "raid1", 00:19:41.962 "superblock": true, 00:19:41.962 "num_base_bdevs": 4, 00:19:41.962 "num_base_bdevs_discovered": 4, 00:19:41.962 "num_base_bdevs_operational": 4, 00:19:41.962 "base_bdevs_list": [ 00:19:41.962 { 00:19:41.962 "name": "pt1", 00:19:41.962 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:41.962 "is_configured": true, 00:19:41.962 "data_offset": 2048, 00:19:41.962 "data_size": 63488 00:19:41.962 }, 00:19:41.962 { 00:19:41.962 "name": "pt2", 00:19:41.962 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:41.962 "is_configured": true, 00:19:41.962 "data_offset": 2048, 00:19:41.962 "data_size": 63488 00:19:41.962 }, 00:19:41.962 { 00:19:41.962 "name": "pt3", 00:19:41.962 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:41.962 "is_configured": true, 00:19:41.962 "data_offset": 2048, 00:19:41.962 "data_size": 63488 00:19:41.962 }, 00:19:41.962 { 00:19:41.962 "name": "pt4", 00:19:41.962 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:41.962 "is_configured": true, 00:19:41.962 "data_offset": 2048, 00:19:41.962 "data_size": 63488 00:19:41.962 } 00:19:41.962 ] 00:19:41.962 }' 00:19:41.962 10:15:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:41.962 10:15:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:42.531 [2024-06-10 10:15:04.324516] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:42.531 "name": "raid_bdev1", 00:19:42.531 "aliases": [ 00:19:42.531 "84bf5243-56ff-4c1a-a053-fe75f8fc8f52" 00:19:42.531 ], 00:19:42.531 "product_name": "Raid Volume", 00:19:42.531 "block_size": 512, 00:19:42.531 "num_blocks": 63488, 00:19:42.531 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:42.531 "assigned_rate_limits": { 00:19:42.531 "rw_ios_per_sec": 0, 00:19:42.531 "rw_mbytes_per_sec": 0, 00:19:42.531 "r_mbytes_per_sec": 0, 00:19:42.531 "w_mbytes_per_sec": 0 00:19:42.531 }, 00:19:42.531 "claimed": false, 00:19:42.531 "zoned": false, 00:19:42.531 "supported_io_types": { 00:19:42.531 "read": true, 00:19:42.531 "write": true, 00:19:42.531 "unmap": false, 00:19:42.531 "write_zeroes": true, 00:19:42.531 "flush": false, 00:19:42.531 "reset": true, 00:19:42.531 "compare": false, 00:19:42.531 
"compare_and_write": false, 00:19:42.531 "abort": false, 00:19:42.531 "nvme_admin": false, 00:19:42.531 "nvme_io": false 00:19:42.531 }, 00:19:42.531 "memory_domains": [ 00:19:42.531 { 00:19:42.531 "dma_device_id": "system", 00:19:42.531 "dma_device_type": 1 00:19:42.531 }, 00:19:42.531 { 00:19:42.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.531 "dma_device_type": 2 00:19:42.531 }, 00:19:42.531 { 00:19:42.531 "dma_device_id": "system", 00:19:42.531 "dma_device_type": 1 00:19:42.531 }, 00:19:42.531 { 00:19:42.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.531 "dma_device_type": 2 00:19:42.531 }, 00:19:42.531 { 00:19:42.531 "dma_device_id": "system", 00:19:42.531 "dma_device_type": 1 00:19:42.531 }, 00:19:42.531 { 00:19:42.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.531 "dma_device_type": 2 00:19:42.531 }, 00:19:42.531 { 00:19:42.531 "dma_device_id": "system", 00:19:42.531 "dma_device_type": 1 00:19:42.531 }, 00:19:42.531 { 00:19:42.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.531 "dma_device_type": 2 00:19:42.531 } 00:19:42.531 ], 00:19:42.531 "driver_specific": { 00:19:42.531 "raid": { 00:19:42.531 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:42.531 "strip_size_kb": 0, 00:19:42.531 "state": "online", 00:19:42.531 "raid_level": "raid1", 00:19:42.531 "superblock": true, 00:19:42.531 "num_base_bdevs": 4, 00:19:42.531 "num_base_bdevs_discovered": 4, 00:19:42.531 "num_base_bdevs_operational": 4, 00:19:42.531 "base_bdevs_list": [ 00:19:42.531 { 00:19:42.531 "name": "pt1", 00:19:42.531 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:42.531 "is_configured": true, 00:19:42.531 "data_offset": 2048, 00:19:42.531 "data_size": 63488 00:19:42.531 }, 00:19:42.531 { 00:19:42.531 "name": "pt2", 00:19:42.531 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:42.531 "is_configured": true, 00:19:42.531 "data_offset": 2048, 00:19:42.531 "data_size": 63488 00:19:42.531 }, 00:19:42.531 { 00:19:42.531 "name": "pt3", 00:19:42.531 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:42.531 "is_configured": true, 00:19:42.531 "data_offset": 2048, 00:19:42.531 "data_size": 63488 00:19:42.531 }, 00:19:42.531 { 00:19:42.531 "name": "pt4", 00:19:42.531 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:42.531 "is_configured": true, 00:19:42.531 "data_offset": 2048, 00:19:42.531 "data_size": 63488 00:19:42.531 } 00:19:42.531 ] 00:19:42.531 } 00:19:42.531 } 00:19:42.531 }' 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:42.531 pt2 00:19:42.531 pt3 00:19:42.531 pt4' 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:42.531 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.791 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:42.791 "name": "pt1", 00:19:42.791 "aliases": [ 00:19:42.791 "00000000-0000-0000-0000-000000000001" 00:19:42.791 ], 00:19:42.791 "product_name": "passthru", 00:19:42.791 "block_size": 512, 00:19:42.791 "num_blocks": 65536, 00:19:42.791 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:19:42.791 "assigned_rate_limits": { 00:19:42.791 "rw_ios_per_sec": 0, 00:19:42.791 "rw_mbytes_per_sec": 0, 00:19:42.791 "r_mbytes_per_sec": 0, 00:19:42.791 "w_mbytes_per_sec": 0 00:19:42.791 }, 00:19:42.791 "claimed": true, 00:19:42.791 "claim_type": "exclusive_write", 00:19:42.791 "zoned": false, 00:19:42.791 "supported_io_types": { 00:19:42.791 "read": true, 00:19:42.791 "write": true, 00:19:42.791 "unmap": true, 00:19:42.791 "write_zeroes": true, 00:19:42.791 "flush": true, 00:19:42.791 "reset": true, 00:19:42.791 "compare": false, 00:19:42.791 "compare_and_write": false, 00:19:42.791 "abort": true, 00:19:42.791 "nvme_admin": false, 00:19:42.791 "nvme_io": false 00:19:42.791 }, 00:19:42.791 "memory_domains": [ 00:19:42.791 { 00:19:42.791 "dma_device_id": "system", 00:19:42.791 "dma_device_type": 1 00:19:42.791 }, 00:19:42.791 { 00:19:42.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.791 "dma_device_type": 2 00:19:42.791 } 00:19:42.791 ], 00:19:42.791 "driver_specific": { 00:19:42.791 "passthru": { 00:19:42.791 "name": "pt1", 00:19:42.791 "base_bdev_name": "malloc1" 00:19:42.791 } 00:19:42.791 } 00:19:42.791 }' 00:19:42.791 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.791 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.051 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.051 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.051 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.051 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.051 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.051 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.051 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.051 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.310 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.311 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.311 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.311 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.311 10:15:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:43.311 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.311 "name": "pt2", 00:19:43.311 "aliases": [ 00:19:43.311 "00000000-0000-0000-0000-000000000002" 00:19:43.311 ], 00:19:43.311 "product_name": "passthru", 00:19:43.311 "block_size": 512, 00:19:43.311 "num_blocks": 65536, 00:19:43.311 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:43.311 "assigned_rate_limits": { 00:19:43.311 "rw_ios_per_sec": 0, 00:19:43.311 "rw_mbytes_per_sec": 0, 00:19:43.311 "r_mbytes_per_sec": 0, 00:19:43.311 "w_mbytes_per_sec": 0 00:19:43.311 }, 00:19:43.311 "claimed": true, 00:19:43.311 "claim_type": "exclusive_write", 00:19:43.311 "zoned": false, 00:19:43.311 "supported_io_types": { 00:19:43.311 "read": true, 00:19:43.311 "write": true, 
00:19:43.311 "unmap": true, 00:19:43.311 "write_zeroes": true, 00:19:43.311 "flush": true, 00:19:43.311 "reset": true, 00:19:43.311 "compare": false, 00:19:43.311 "compare_and_write": false, 00:19:43.311 "abort": true, 00:19:43.311 "nvme_admin": false, 00:19:43.311 "nvme_io": false 00:19:43.311 }, 00:19:43.311 "memory_domains": [ 00:19:43.311 { 00:19:43.311 "dma_device_id": "system", 00:19:43.311 "dma_device_type": 1 00:19:43.311 }, 00:19:43.311 { 00:19:43.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.311 "dma_device_type": 2 00:19:43.311 } 00:19:43.311 ], 00:19:43.311 "driver_specific": { 00:19:43.311 "passthru": { 00:19:43.311 "name": "pt2", 00:19:43.311 "base_bdev_name": "malloc2" 00:19:43.311 } 00:19:43.311 } 00:19:43.311 }' 00:19:43.311 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.571 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.571 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.571 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.571 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.571 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.571 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.571 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.571 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.571 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.831 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.831 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.831 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.831 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:43.831 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.831 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.831 "name": "pt3", 00:19:43.831 "aliases": [ 00:19:43.831 "00000000-0000-0000-0000-000000000003" 00:19:43.831 ], 00:19:43.831 "product_name": "passthru", 00:19:43.831 "block_size": 512, 00:19:43.831 "num_blocks": 65536, 00:19:43.831 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:43.831 "assigned_rate_limits": { 00:19:43.831 "rw_ios_per_sec": 0, 00:19:43.831 "rw_mbytes_per_sec": 0, 00:19:43.831 "r_mbytes_per_sec": 0, 00:19:43.831 "w_mbytes_per_sec": 0 00:19:43.831 }, 00:19:43.831 "claimed": true, 00:19:43.831 "claim_type": "exclusive_write", 00:19:43.831 "zoned": false, 00:19:43.831 "supported_io_types": { 00:19:43.831 "read": true, 00:19:43.831 "write": true, 00:19:43.831 "unmap": true, 00:19:43.831 "write_zeroes": true, 00:19:43.831 "flush": true, 00:19:43.831 "reset": true, 00:19:43.831 "compare": false, 00:19:43.831 "compare_and_write": false, 00:19:43.831 "abort": true, 00:19:43.831 "nvme_admin": false, 00:19:43.831 "nvme_io": false 00:19:43.831 }, 00:19:43.831 "memory_domains": [ 00:19:43.831 { 00:19:43.831 "dma_device_id": "system", 00:19:43.831 "dma_device_type": 1 00:19:43.831 }, 
00:19:43.831 { 00:19:43.831 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.831 "dma_device_type": 2 00:19:43.831 } 00:19:43.831 ], 00:19:43.831 "driver_specific": { 00:19:43.831 "passthru": { 00:19:43.831 "name": "pt3", 00:19:43.831 "base_bdev_name": "malloc3" 00:19:43.831 } 00:19:43.831 } 00:19:43.831 }' 00:19:43.831 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.090 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.090 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:44.090 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.090 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.090 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.090 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.090 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.351 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.351 10:15:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.351 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.351 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.351 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:44.351 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:44.351 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:44.611 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:44.611 "name": "pt4", 00:19:44.611 "aliases": [ 00:19:44.611 "00000000-0000-0000-0000-000000000004" 00:19:44.611 ], 00:19:44.611 "product_name": "passthru", 00:19:44.611 "block_size": 512, 00:19:44.611 "num_blocks": 65536, 00:19:44.611 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:44.611 "assigned_rate_limits": { 00:19:44.611 "rw_ios_per_sec": 0, 00:19:44.611 "rw_mbytes_per_sec": 0, 00:19:44.611 "r_mbytes_per_sec": 0, 00:19:44.611 "w_mbytes_per_sec": 0 00:19:44.611 }, 00:19:44.611 "claimed": true, 00:19:44.611 "claim_type": "exclusive_write", 00:19:44.611 "zoned": false, 00:19:44.611 "supported_io_types": { 00:19:44.611 "read": true, 00:19:44.611 "write": true, 00:19:44.611 "unmap": true, 00:19:44.611 "write_zeroes": true, 00:19:44.611 "flush": true, 00:19:44.611 "reset": true, 00:19:44.611 "compare": false, 00:19:44.611 "compare_and_write": false, 00:19:44.611 "abort": true, 00:19:44.611 "nvme_admin": false, 00:19:44.611 "nvme_io": false 00:19:44.611 }, 00:19:44.611 "memory_domains": [ 00:19:44.611 { 00:19:44.611 "dma_device_id": "system", 00:19:44.611 "dma_device_type": 1 00:19:44.611 }, 00:19:44.611 { 00:19:44.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.611 "dma_device_type": 2 00:19:44.611 } 00:19:44.611 ], 00:19:44.611 "driver_specific": { 00:19:44.611 "passthru": { 00:19:44.611 "name": "pt4", 00:19:44.611 "base_bdev_name": "malloc4" 00:19:44.611 } 00:19:44.611 } 00:19:44.611 }' 00:19:44.611 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.611 10:15:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.611 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:44.611 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.611 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.611 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.611 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.611 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.871 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.871 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.871 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.871 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.871 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:44.871 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:45.131 [2024-06-10 10:15:06.754668] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:45.131 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 84bf5243-56ff-4c1a-a053-fe75f8fc8f52 '!=' 84bf5243-56ff-4c1a-a053-fe75f8fc8f52 ']' 00:19:45.131 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:19:45.131 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:45.131 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:45.131 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:45.131 [2024-06-10 10:15:06.946959] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:19:45.132 10:15:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:45.392 10:15:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:45.392 "name": "raid_bdev1", 00:19:45.392 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:45.392 "strip_size_kb": 0, 00:19:45.392 "state": "online", 00:19:45.392 "raid_level": "raid1", 00:19:45.392 "superblock": true, 00:19:45.392 "num_base_bdevs": 4, 00:19:45.392 "num_base_bdevs_discovered": 3, 00:19:45.392 "num_base_bdevs_operational": 3, 00:19:45.392 "base_bdevs_list": [ 00:19:45.392 { 00:19:45.392 "name": null, 00:19:45.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:45.392 "is_configured": false, 00:19:45.392 "data_offset": 2048, 00:19:45.392 "data_size": 63488 00:19:45.392 }, 00:19:45.392 { 00:19:45.392 "name": "pt2", 00:19:45.392 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:45.392 "is_configured": true, 00:19:45.392 "data_offset": 2048, 00:19:45.392 "data_size": 63488 00:19:45.392 }, 00:19:45.392 { 00:19:45.392 "name": "pt3", 00:19:45.392 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:45.392 "is_configured": true, 00:19:45.392 "data_offset": 2048, 00:19:45.392 "data_size": 63488 00:19:45.392 }, 00:19:45.392 { 00:19:45.392 "name": "pt4", 00:19:45.392 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:45.392 "is_configured": true, 00:19:45.392 "data_offset": 2048, 00:19:45.392 "data_size": 63488 00:19:45.392 } 00:19:45.392 ] 00:19:45.392 }' 00:19:45.392 10:15:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:45.392 10:15:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:45.961 10:15:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:46.236 [2024-06-10 10:15:07.861256] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:46.236 [2024-06-10 10:15:07.861271] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:46.236 [2024-06-10 10:15:07.861305] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:46.236 [2024-06-10 10:15:07.861354] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:46.237 [2024-06-10 10:15:07.861359] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bec7d0 name raid_bdev1, state offline 00:19:46.237 10:15:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.237 10:15:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:19:46.237 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:19:46.237 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:19:46.237 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:19:46.237 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:46.237 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:46.509 10:15:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i++ )) 00:19:46.509 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:46.509 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:46.769 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:19:46.769 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:46.769 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:47.030 [2024-06-10 10:15:08.811616] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:47.030 [2024-06-10 10:15:08.811646] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:47.030 [2024-06-10 10:15:08.811657] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1be5900 00:19:47.030 [2024-06-10 10:15:08.811663] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:47.030 [2024-06-10 10:15:08.812942] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:47.030 [2024-06-10 10:15:08.812963] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:47.030 [2024-06-10 10:15:08.813008] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:47.030 [2024-06-10 10:15:08.813027] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:47.030 pt2 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.030 10:15:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:47.290 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:47.290 "name": "raid_bdev1", 00:19:47.290 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:47.290 "strip_size_kb": 0, 00:19:47.290 "state": "configuring", 00:19:47.290 "raid_level": "raid1", 00:19:47.290 "superblock": true, 00:19:47.290 "num_base_bdevs": 4, 00:19:47.290 "num_base_bdevs_discovered": 1, 00:19:47.290 "num_base_bdevs_operational": 3, 00:19:47.290 "base_bdevs_list": [ 00:19:47.290 { 00:19:47.290 "name": null, 00:19:47.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.290 "is_configured": false, 00:19:47.290 "data_offset": 2048, 00:19:47.290 "data_size": 63488 00:19:47.290 }, 00:19:47.290 { 00:19:47.290 "name": "pt2", 00:19:47.290 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:47.290 "is_configured": true, 00:19:47.290 "data_offset": 2048, 00:19:47.290 "data_size": 63488 00:19:47.290 }, 00:19:47.290 { 00:19:47.290 "name": null, 00:19:47.290 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:47.290 "is_configured": false, 00:19:47.290 "data_offset": 2048, 00:19:47.290 "data_size": 63488 00:19:47.290 }, 00:19:47.290 { 00:19:47.290 "name": null, 00:19:47.290 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:47.290 "is_configured": false, 00:19:47.290 "data_offset": 2048, 00:19:47.290 "data_size": 63488 00:19:47.290 } 00:19:47.290 ] 00:19:47.290 }' 00:19:47.290 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:47.290 10:15:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:47.861 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:19:47.861 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:19:47.861 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:47.861 [2024-06-10 10:15:09.717910] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:47.861 [2024-06-10 10:15:09.717941] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:47.861 [2024-06-10 10:15:09.717951] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1beca50 00:19:47.861 [2024-06-10 10:15:09.717957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:47.861 [2024-06-10 10:15:09.718216] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:47.861 [2024-06-10 10:15:09.718226] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:47.861 [2024-06-10 10:15:09.718268] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:47.861 [2024-06-10 10:15:09.718280] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:47.861 pt3 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:48.121 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:48.122 "name": "raid_bdev1", 00:19:48.122 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:48.122 "strip_size_kb": 0, 00:19:48.122 "state": "configuring", 00:19:48.122 "raid_level": "raid1", 00:19:48.122 "superblock": true, 00:19:48.122 "num_base_bdevs": 4, 00:19:48.122 "num_base_bdevs_discovered": 2, 00:19:48.122 "num_base_bdevs_operational": 3, 00:19:48.122 "base_bdevs_list": [ 00:19:48.122 { 00:19:48.122 "name": null, 00:19:48.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.122 "is_configured": false, 00:19:48.122 "data_offset": 2048, 00:19:48.122 "data_size": 63488 00:19:48.122 }, 00:19:48.122 { 00:19:48.122 "name": "pt2", 00:19:48.122 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:48.122 "is_configured": true, 00:19:48.122 "data_offset": 2048, 00:19:48.122 "data_size": 63488 00:19:48.122 }, 00:19:48.122 { 00:19:48.122 "name": "pt3", 00:19:48.122 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:48.122 "is_configured": true, 00:19:48.122 "data_offset": 2048, 00:19:48.122 "data_size": 63488 00:19:48.122 }, 00:19:48.122 { 00:19:48.122 "name": null, 00:19:48.122 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:48.122 "is_configured": false, 00:19:48.122 "data_offset": 2048, 00:19:48.122 "data_size": 63488 00:19:48.122 } 00:19:48.122 ] 00:19:48.122 }' 00:19:48.122 10:15:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:48.122 10:15:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:48.692 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:19:48.692 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:19:48.692 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:19:48.692 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:48.952 [2024-06-10 10:15:10.660303] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:48.952 [2024-06-10 10:15:10.660338] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:48.952 [2024-06-10 10:15:10.660349] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a3bc90 00:19:48.952 [2024-06-10 10:15:10.660355] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:48.952 [2024-06-10 10:15:10.660615] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:48.952 [2024-06-10 10:15:10.660625] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:48.952 [2024-06-10 10:15:10.660667] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:48.952 [2024-06-10 10:15:10.660678] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:48.952 [2024-06-10 10:15:10.660763] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a3b360 00:19:48.952 [2024-06-10 10:15:10.660768] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:48.952 [2024-06-10 10:15:10.660908] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a3b830 00:19:48.953 [2024-06-10 10:15:10.661013] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a3b360 00:19:48.953 [2024-06-10 10:15:10.661019] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a3b360 00:19:48.953 [2024-06-10 10:15:10.661092] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:48.953 pt4 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.953 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:49.213 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:49.213 "name": "raid_bdev1", 00:19:49.213 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:49.213 "strip_size_kb": 0, 00:19:49.213 "state": "online", 00:19:49.213 "raid_level": "raid1", 00:19:49.213 "superblock": true, 00:19:49.213 "num_base_bdevs": 4, 00:19:49.213 "num_base_bdevs_discovered": 3, 00:19:49.213 "num_base_bdevs_operational": 3, 00:19:49.213 "base_bdevs_list": [ 00:19:49.213 { 00:19:49.213 "name": null, 00:19:49.213 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:49.213 "is_configured": false, 00:19:49.213 "data_offset": 2048, 00:19:49.213 "data_size": 63488 00:19:49.213 }, 00:19:49.213 { 00:19:49.213 "name": "pt2", 00:19:49.213 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:49.213 "is_configured": true, 00:19:49.213 "data_offset": 2048, 00:19:49.213 "data_size": 63488 00:19:49.213 }, 00:19:49.213 { 00:19:49.213 "name": "pt3", 00:19:49.213 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:49.213 "is_configured": true, 00:19:49.213 "data_offset": 2048, 00:19:49.213 "data_size": 63488 00:19:49.213 }, 00:19:49.213 { 00:19:49.213 "name": "pt4", 00:19:49.213 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:49.213 "is_configured": true, 00:19:49.213 "data_offset": 2048, 00:19:49.214 "data_size": 63488 00:19:49.214 } 00:19:49.214 ] 00:19:49.214 }' 00:19:49.214 10:15:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:49.214 10:15:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:49.784 10:15:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:49.784 [2024-06-10 10:15:11.530482] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:49.784 [2024-06-10 10:15:11.530497] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:49.784 [2024-06-10 10:15:11.530530] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:49.784 [2024-06-10 10:15:11.530577] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:49.784 [2024-06-10 10:15:11.530583] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a3b360 name raid_bdev1, state offline 00:19:49.784 10:15:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.784 10:15:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:19:50.044 10:15:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:19:50.044 10:15:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:19:50.044 10:15:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:19:50.044 10:15:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:19:50.044 10:15:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:50.044 10:15:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:50.304 [2024-06-10 10:15:12.083871] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:50.304 [2024-06-10 10:15:12.083895] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:50.304 [2024-06-10 10:15:12.083905] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bdd770 00:19:50.304 [2024-06-10 10:15:12.083911] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:50.304 [2024-06-10 
10:15:12.085179] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:50.304 [2024-06-10 10:15:12.085199] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:50.304 [2024-06-10 10:15:12.085245] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:50.304 [2024-06-10 10:15:12.085263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:50.304 [2024-06-10 10:15:12.085335] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:19:50.304 [2024-06-10 10:15:12.085342] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:50.304 [2024-06-10 10:15:12.085351] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bed140 name raid_bdev1, state configuring 00:19:50.304 [2024-06-10 10:15:12.085366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:50.304 [2024-06-10 10:15:12.085422] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:50.304 pt1 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.304 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:50.564 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.564 "name": "raid_bdev1", 00:19:50.564 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:50.564 "strip_size_kb": 0, 00:19:50.564 "state": "configuring", 00:19:50.564 "raid_level": "raid1", 00:19:50.564 "superblock": true, 00:19:50.564 "num_base_bdevs": 4, 00:19:50.564 "num_base_bdevs_discovered": 2, 00:19:50.564 "num_base_bdevs_operational": 3, 00:19:50.564 "base_bdevs_list": [ 00:19:50.564 { 00:19:50.564 "name": null, 00:19:50.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.564 "is_configured": false, 00:19:50.564 "data_offset": 2048, 00:19:50.564 "data_size": 63488 00:19:50.564 }, 00:19:50.564 { 00:19:50.564 "name": "pt2", 00:19:50.564 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:50.564 "is_configured": true, 
00:19:50.564 "data_offset": 2048, 00:19:50.564 "data_size": 63488 00:19:50.564 }, 00:19:50.564 { 00:19:50.564 "name": "pt3", 00:19:50.564 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:50.564 "is_configured": true, 00:19:50.564 "data_offset": 2048, 00:19:50.564 "data_size": 63488 00:19:50.564 }, 00:19:50.564 { 00:19:50.564 "name": null, 00:19:50.564 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:50.564 "is_configured": false, 00:19:50.564 "data_offset": 2048, 00:19:50.564 "data_size": 63488 00:19:50.564 } 00:19:50.564 ] 00:19:50.564 }' 00:19:50.564 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.564 10:15:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:51.135 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:19:51.135 10:15:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:51.395 [2024-06-10 10:15:13.206710] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:51.395 [2024-06-10 10:15:13.206746] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.395 [2024-06-10 10:15:13.206757] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1be9880 00:19:51.395 [2024-06-10 10:15:13.206763] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.395 [2024-06-10 10:15:13.207041] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.395 [2024-06-10 10:15:13.207052] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:51.395 [2024-06-10 10:15:13.207097] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:51.395 [2024-06-10 10:15:13.207109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:51.395 [2024-06-10 10:15:13.207194] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1be6d20 00:19:51.395 [2024-06-10 10:15:13.207200] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:51.395 [2024-06-10 10:15:13.207335] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1708d30 00:19:51.395 [2024-06-10 10:15:13.207443] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1be6d20 00:19:51.395 [2024-06-10 10:15:13.207448] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1be6d20 00:19:51.395 [2024-06-10 10:15:13.207520] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:51.395 pt4 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:51.395 
10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.395 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:51.654 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.654 "name": "raid_bdev1", 00:19:51.654 "uuid": "84bf5243-56ff-4c1a-a053-fe75f8fc8f52", 00:19:51.654 "strip_size_kb": 0, 00:19:51.654 "state": "online", 00:19:51.654 "raid_level": "raid1", 00:19:51.654 "superblock": true, 00:19:51.654 "num_base_bdevs": 4, 00:19:51.654 "num_base_bdevs_discovered": 3, 00:19:51.654 "num_base_bdevs_operational": 3, 00:19:51.654 "base_bdevs_list": [ 00:19:51.654 { 00:19:51.654 "name": null, 00:19:51.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.654 "is_configured": false, 00:19:51.654 "data_offset": 2048, 00:19:51.654 "data_size": 63488 00:19:51.654 }, 00:19:51.654 { 00:19:51.654 "name": "pt2", 00:19:51.654 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:51.654 "is_configured": true, 00:19:51.654 "data_offset": 2048, 00:19:51.654 "data_size": 63488 00:19:51.654 }, 00:19:51.654 { 00:19:51.654 "name": "pt3", 00:19:51.654 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:51.654 "is_configured": true, 00:19:51.654 "data_offset": 2048, 00:19:51.654 "data_size": 63488 00:19:51.654 }, 00:19:51.654 { 00:19:51.654 "name": "pt4", 00:19:51.654 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:51.654 "is_configured": true, 00:19:51.654 "data_offset": 2048, 00:19:51.654 "data_size": 63488 00:19:51.654 } 00:19:51.654 ] 00:19:51.654 }' 00:19:51.654 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.654 10:15:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:52.223 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:19:52.223 10:15:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:19:52.483 10:15:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:19:52.483 10:15:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:52.484 10:15:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:19:52.484 [2024-06-10 10:15:14.293646] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:52.484 10:15:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 84bf5243-56ff-4c1a-a053-fe75f8fc8f52 '!=' 84bf5243-56ff-4c1a-a053-fe75f8fc8f52 ']' 00:19:52.484 10:15:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1062651 00:19:52.484 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1062651 ']' 00:19:52.484 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1062651 00:19:52.484 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:19:52.484 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:52.484 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1062651 00:19:52.744 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:52.744 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:52.744 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1062651' 00:19:52.744 killing process with pid 1062651 00:19:52.744 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1062651 00:19:52.744 [2024-06-10 10:15:14.363439] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:52.744 [2024-06-10 10:15:14.363479] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:52.744 [2024-06-10 10:15:14.363530] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:52.744 [2024-06-10 10:15:14.363536] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1be6d20 name raid_bdev1, state offline 00:19:52.744 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1062651 00:19:52.744 [2024-06-10 10:15:14.384015] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:52.744 10:15:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:52.744 00:19:52.744 real 0m21.179s 00:19:52.744 user 0m39.645s 00:19:52.744 sys 0m3.066s 00:19:52.744 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:52.744 10:15:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:52.744 ************************************ 00:19:52.744 END TEST raid_superblock_test 00:19:52.744 ************************************ 00:19:52.744 10:15:14 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:19:52.744 10:15:14 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:19:52.744 10:15:14 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:52.744 10:15:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:52.744 ************************************ 00:19:52.744 START TEST raid_read_error_test 00:19:52.744 ************************************ 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 4 read 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:52.745 
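
Annotation: raid_io_error_test drives real I/O through the array instead of only issuing RPCs. As the invocation further below in this trace shows, bdevperf is started in wait mode (-z) against the raid test socket, the base bdevs and raid_bdev1 are assembled over that same socket, and a helper script then triggers the workload. A hedged sketch assembled from those exact arguments:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/build/examples/bdevperf -r /var/tmp/spdk-raid.sock \
    -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid &
# ... create the base bdevs and raid_bdev1 over the same socket ...
$SPDK/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
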
10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.OIEYrQH93F 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1067342 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1067342 /var/tmp/spdk-raid.sock 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1067342 ']' 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on 
UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:52.745 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:52.745 10:15:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:53.005 [2024-06-10 10:15:14.653567] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:19:53.005 [2024-06-10 10:15:14.653614] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1067342 ] 00:19:53.005 [2024-06-10 10:15:14.739948] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:53.005 [2024-06-10 10:15:14.804607] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:53.005 [2024-06-10 10:15:14.843902] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:53.005 [2024-06-10 10:15:14.843924] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:53.945 10:15:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:53.945 10:15:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:19:53.945 10:15:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:53.945 10:15:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:53.945 BaseBdev1_malloc 00:19:53.945 10:15:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:54.205 true 00:19:54.205 10:15:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:54.205 [2024-06-10 10:15:16.026269] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:54.205 [2024-06-10 10:15:16.026302] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:54.205 [2024-06-10 10:15:16.026314] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28e1d10 00:19:54.205 [2024-06-10 10:15:16.026325] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:54.205 [2024-06-10 10:15:16.027736] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:54.205 [2024-06-10 10:15:16.027755] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:54.205 BaseBdev1 00:19:54.205 10:15:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:54.205 10:15:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:54.465 BaseBdev2_malloc 00:19:54.465 10:15:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 
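
Annotation: each base device in the error tests is a three-layer stack built with the RPCs traced here: a malloc bdev, an error bdev wrapped around it, and a passthru bdev on top that the raid is assembled from. A sketch for one base bdev, using only commands that appear in this log (the EE_-prefixed name is the error bdev that the subsequent passthru and error-injection calls in the trace operate on):

sock=/var/tmp/spdk-raid.sock
rpc=./scripts/rpc.py    # assumes an SPDK checkout as the working directory
$rpc -s "$sock" bdev_malloc_create 32 512 -b BaseBdev1_malloc
$rpc -s "$sock" bdev_error_create BaseBdev1_malloc                # exposes EE_BaseBdev1_malloc
$rpc -s "$sock" bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
# later the test arms the error bdev, e.g. for the read case:
$rpc -s "$sock" bdev_error_inject_error EE_BaseBdev1_malloc read failure
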
00:19:54.725 true 00:19:54.725 10:15:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:54.986 [2024-06-10 10:15:16.593607] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:54.986 [2024-06-10 10:15:16.593636] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:54.986 [2024-06-10 10:15:16.593647] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28e6710 00:19:54.986 [2024-06-10 10:15:16.593653] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:54.986 [2024-06-10 10:15:16.594897] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:54.986 [2024-06-10 10:15:16.594915] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:54.986 BaseBdev2 00:19:54.986 10:15:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:54.986 10:15:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:54.986 BaseBdev3_malloc 00:19:54.986 10:15:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:55.245 true 00:19:55.245 10:15:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:55.505 [2024-06-10 10:15:17.160863] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:55.505 [2024-06-10 10:15:17.160890] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:55.505 [2024-06-10 10:15:17.160900] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28e7340 00:19:55.505 [2024-06-10 10:15:17.160906] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:55.505 [2024-06-10 10:15:17.162144] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:55.505 [2024-06-10 10:15:17.162162] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:55.505 BaseBdev3 00:19:55.505 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:55.505 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:55.505 BaseBdev4_malloc 00:19:55.505 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:55.765 true 00:19:55.765 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:56.025 [2024-06-10 10:15:17.732204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:56.025 
[2024-06-10 10:15:17.732233] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:56.025 [2024-06-10 10:15:17.732250] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28e0aa0 00:19:56.025 [2024-06-10 10:15:17.732257] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:56.025 [2024-06-10 10:15:17.733502] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:56.025 [2024-06-10 10:15:17.733521] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:56.025 BaseBdev4 00:19:56.025 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:56.285 [2024-06-10 10:15:17.920700] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:56.285 [2024-06-10 10:15:17.921718] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:56.285 [2024-06-10 10:15:17.921771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:56.285 [2024-06-10 10:15:17.921828] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:56.285 [2024-06-10 10:15:17.922009] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x28ea1b0 00:19:56.285 [2024-06-10 10:15:17.922016] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:56.285 [2024-06-10 10:15:17.922162] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27310d0 00:19:56.285 [2024-06-10 10:15:17.922283] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28ea1b0 00:19:56.285 [2024-06-10 10:15:17.922288] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28ea1b0 00:19:56.285 [2024-06-10 10:15:17.922363] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.285 10:15:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:56.285 10:15:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:56.285 "name": "raid_bdev1", 00:19:56.285 "uuid": "ca246ab4-6d70-4f96-9915-c9f6db080e62", 00:19:56.285 "strip_size_kb": 0, 00:19:56.285 "state": "online", 00:19:56.285 "raid_level": "raid1", 00:19:56.285 "superblock": true, 00:19:56.285 "num_base_bdevs": 4, 00:19:56.285 "num_base_bdevs_discovered": 4, 00:19:56.285 "num_base_bdevs_operational": 4, 00:19:56.285 "base_bdevs_list": [ 00:19:56.285 { 00:19:56.285 "name": "BaseBdev1", 00:19:56.285 "uuid": "8c35c8a1-674a-5b6a-8194-c19967fe7913", 00:19:56.285 "is_configured": true, 00:19:56.285 "data_offset": 2048, 00:19:56.285 "data_size": 63488 00:19:56.285 }, 00:19:56.285 { 00:19:56.285 "name": "BaseBdev2", 00:19:56.285 "uuid": "0d46c640-7536-5f91-bc72-937f923cf384", 00:19:56.285 "is_configured": true, 00:19:56.285 "data_offset": 2048, 00:19:56.285 "data_size": 63488 00:19:56.285 }, 00:19:56.285 { 00:19:56.285 "name": "BaseBdev3", 00:19:56.285 "uuid": "89e60a0d-a1de-51eb-b043-e699b245e8c2", 00:19:56.285 "is_configured": true, 00:19:56.285 "data_offset": 2048, 00:19:56.285 "data_size": 63488 00:19:56.285 }, 00:19:56.285 { 00:19:56.285 "name": "BaseBdev4", 00:19:56.285 "uuid": "e890533b-881e-5cce-89a3-55f0da8a3eef", 00:19:56.285 "is_configured": true, 00:19:56.285 "data_offset": 2048, 00:19:56.285 "data_size": 63488 00:19:56.285 } 00:19:56.285 ] 00:19:56.285 }' 00:19:56.285 10:15:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:56.285 10:15:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:56.855 10:15:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:56.855 10:15:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:57.115 [2024-06-10 10:15:18.746963] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2731010 00:19:58.058 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:58.058 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:58.059 10:15:19 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:58.059 10:15:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.318 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.318 "name": "raid_bdev1", 00:19:58.318 "uuid": "ca246ab4-6d70-4f96-9915-c9f6db080e62", 00:19:58.318 "strip_size_kb": 0, 00:19:58.318 "state": "online", 00:19:58.318 "raid_level": "raid1", 00:19:58.318 "superblock": true, 00:19:58.318 "num_base_bdevs": 4, 00:19:58.318 "num_base_bdevs_discovered": 4, 00:19:58.318 "num_base_bdevs_operational": 4, 00:19:58.318 "base_bdevs_list": [ 00:19:58.318 { 00:19:58.318 "name": "BaseBdev1", 00:19:58.318 "uuid": "8c35c8a1-674a-5b6a-8194-c19967fe7913", 00:19:58.318 "is_configured": true, 00:19:58.318 "data_offset": 2048, 00:19:58.318 "data_size": 63488 00:19:58.318 }, 00:19:58.318 { 00:19:58.318 "name": "BaseBdev2", 00:19:58.318 "uuid": "0d46c640-7536-5f91-bc72-937f923cf384", 00:19:58.318 "is_configured": true, 00:19:58.318 "data_offset": 2048, 00:19:58.318 "data_size": 63488 00:19:58.318 }, 00:19:58.318 { 00:19:58.318 "name": "BaseBdev3", 00:19:58.318 "uuid": "89e60a0d-a1de-51eb-b043-e699b245e8c2", 00:19:58.318 "is_configured": true, 00:19:58.318 "data_offset": 2048, 00:19:58.318 "data_size": 63488 00:19:58.318 }, 00:19:58.318 { 00:19:58.318 "name": "BaseBdev4", 00:19:58.318 "uuid": "e890533b-881e-5cce-89a3-55f0da8a3eef", 00:19:58.318 "is_configured": true, 00:19:58.318 "data_offset": 2048, 00:19:58.318 "data_size": 63488 00:19:58.318 } 00:19:58.318 ] 00:19:58.318 }' 00:19:58.318 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.318 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:58.886 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:58.886 [2024-06-10 10:15:20.705497] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:58.886 [2024-06-10 10:15:20.705536] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:58.886 [2024-06-10 10:15:20.708112] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:58.886 [2024-06-10 10:15:20.708140] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:58.886 [2024-06-10 10:15:20.708240] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:58.886 [2024-06-10 10:15:20.708247] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28ea1b0 name raid_bdev1, state offline 00:19:58.886 0 00:19:58.886 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1067342 00:19:58.886 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1067342 ']' 00:19:58.886 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 
1067342 00:19:58.886 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:19:58.886 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:58.886 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1067342 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1067342' 00:19:59.147 killing process with pid 1067342 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1067342 00:19:59.147 [2024-06-10 10:15:20.773382] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1067342 00:19:59.147 [2024-06-10 10:15:20.790415] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.OIEYrQH93F 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:19:59.147 00:19:59.147 real 0m6.338s 00:19:59.147 user 0m10.158s 00:19:59.147 sys 0m0.883s 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:59.147 10:15:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:59.147 ************************************ 00:19:59.147 END TEST raid_read_error_test 00:19:59.147 ************************************ 00:19:59.147 10:15:20 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:19:59.147 10:15:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:19:59.147 10:15:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:59.147 10:15:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:59.147 ************************************ 00:19:59.147 START TEST raid_write_error_test 00:19:59.147 ************************************ 00:19:59.147 10:15:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 4 write 00:19:59.147 10:15:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:19:59.147 10:15:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:59.147 10:15:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:59.147 10:15:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:59.147 10:15:20 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:59.147 10:15:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:19:59.147 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:59.407 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.W3KGVRKcRv 00:19:59.407 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1068435 00:19:59.407 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1068435 /var/tmp/spdk-raid.sock 00:19:59.407 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:59.407 10:15:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1068435 ']' 00:19:59.407 10:15:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:59.407 10:15:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:59.407 10:15:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
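
Annotation: the write variant below mirrors the read test but arms the error bdev for writes; as the trace that follows shows, a write failure on raid1 causes the failing base bdev to be removed from the array, so the final state check expects raid_bdev1 to stay online with 3 of its 4 base bdevs. A minimal sketch of that last step, using names and RPCs taken from this trace:

sock=/var/tmp/spdk-raid.sock
./scripts/rpc.py -s "$sock" bdev_error_inject_error EE_BaseBdev1_malloc write failure
# run I/O (the test uses bdevperf's perform_tests), then re-check the array:
./scripts/rpc.py -s "$sock" bdev_raid_get_bdevs all | \
    jq -r '.[] | select(.name == "raid_bdev1") | .num_base_bdevs_discovered'   # 3 expected
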
00:19:59.407 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:59.407 10:15:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:59.407 10:15:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:59.407 [2024-06-10 10:15:21.067172] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:19:59.408 [2024-06-10 10:15:21.067227] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1068435 ] 00:19:59.408 [2024-06-10 10:15:21.153486] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:59.408 [2024-06-10 10:15:21.225236] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:59.668 [2024-06-10 10:15:21.276335] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:59.668 [2024-06-10 10:15:21.276358] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:00.237 10:15:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:20:00.237 10:15:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:20:00.237 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:00.237 10:15:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:00.237 BaseBdev1_malloc 00:20:00.237 10:15:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:00.495 true 00:20:00.495 10:15:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:00.495 [2024-06-10 10:15:22.347376] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:00.495 [2024-06-10 10:15:22.347408] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:00.495 [2024-06-10 10:15:22.347418] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c0dd10 00:20:00.495 [2024-06-10 10:15:22.347425] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:00.495 [2024-06-10 10:15:22.348730] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:00.495 [2024-06-10 10:15:22.348750] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:00.495 BaseBdev1 00:20:00.754 10:15:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:00.754 10:15:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:00.754 BaseBdev2_malloc 00:20:00.754 10:15:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:01.014 true 00:20:01.014 10:15:22 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:01.014 [2024-06-10 10:15:22.790046] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:01.014 [2024-06-10 10:15:22.790071] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:01.014 [2024-06-10 10:15:22.790081] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c12710 00:20:01.014 [2024-06-10 10:15:22.790087] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:01.014 [2024-06-10 10:15:22.791217] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:01.014 [2024-06-10 10:15:22.791234] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:01.014 BaseBdev2 00:20:01.014 10:15:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:01.014 10:15:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:01.274 BaseBdev3_malloc 00:20:01.274 10:15:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:01.274 true 00:20:01.274 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:01.533 [2024-06-10 10:15:23.232681] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:01.533 [2024-06-10 10:15:23.232703] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:01.533 [2024-06-10 10:15:23.232712] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c13340 00:20:01.533 [2024-06-10 10:15:23.232718] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:01.533 [2024-06-10 10:15:23.233858] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:01.533 [2024-06-10 10:15:23.233876] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:01.533 BaseBdev3 00:20:01.533 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:01.533 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:01.533 BaseBdev4_malloc 00:20:01.793 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:01.793 true 00:20:01.793 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:02.054 [2024-06-10 10:15:23.679532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:02.054 [2024-06-10 10:15:23.679557] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.054 [2024-06-10 10:15:23.679569] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c0caa0 00:20:02.054 [2024-06-10 10:15:23.679575] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.054 [2024-06-10 10:15:23.680723] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.054 [2024-06-10 10:15:23.680741] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:02.054 BaseBdev4 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:02.054 [2024-06-10 10:15:23.827928] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:02.054 [2024-06-10 10:15:23.828902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:02.054 [2024-06-10 10:15:23.828952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:02.054 [2024-06-10 10:15:23.829000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:02.054 [2024-06-10 10:15:23.829177] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c161b0 00:20:02.054 [2024-06-10 10:15:23.829185] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:02.054 [2024-06-10 10:15:23.829323] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a5d0d0 00:20:02.054 [2024-06-10 10:15:23.829441] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c161b0 00:20:02.054 [2024-06-10 10:15:23.829446] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c161b0 00:20:02.054 [2024-06-10 10:15:23.829519] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.054 10:15:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:20:02.315 10:15:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.315 "name": "raid_bdev1", 00:20:02.315 "uuid": "8517fa41-0125-4a1e-bd97-9fa124a93bde", 00:20:02.315 "strip_size_kb": 0, 00:20:02.315 "state": "online", 00:20:02.315 "raid_level": "raid1", 00:20:02.315 "superblock": true, 00:20:02.315 "num_base_bdevs": 4, 00:20:02.315 "num_base_bdevs_discovered": 4, 00:20:02.315 "num_base_bdevs_operational": 4, 00:20:02.315 "base_bdevs_list": [ 00:20:02.315 { 00:20:02.315 "name": "BaseBdev1", 00:20:02.315 "uuid": "47d0ed00-1288-5c00-a7d5-f717ef0d84e6", 00:20:02.315 "is_configured": true, 00:20:02.315 "data_offset": 2048, 00:20:02.315 "data_size": 63488 00:20:02.315 }, 00:20:02.315 { 00:20:02.315 "name": "BaseBdev2", 00:20:02.315 "uuid": "0f8201b0-fe05-55fb-9493-175252090eb1", 00:20:02.315 "is_configured": true, 00:20:02.315 "data_offset": 2048, 00:20:02.315 "data_size": 63488 00:20:02.315 }, 00:20:02.315 { 00:20:02.315 "name": "BaseBdev3", 00:20:02.315 "uuid": "a50fbc17-478f-51fe-ac13-3803066fc307", 00:20:02.315 "is_configured": true, 00:20:02.315 "data_offset": 2048, 00:20:02.315 "data_size": 63488 00:20:02.315 }, 00:20:02.315 { 00:20:02.315 "name": "BaseBdev4", 00:20:02.315 "uuid": "26f3ce81-129c-520c-80e6-7e9827b995f8", 00:20:02.315 "is_configured": true, 00:20:02.315 "data_offset": 2048, 00:20:02.315 "data_size": 63488 00:20:02.315 } 00:20:02.315 ] 00:20:02.315 }' 00:20:02.315 10:15:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.315 10:15:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:02.886 10:15:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:02.886 10:15:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:02.886 [2024-06-10 10:15:24.650187] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a5d010 00:20:03.827 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:04.088 [2024-06-10 10:15:25.725238] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:20:04.088 [2024-06-10 10:15:25.725285] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:04.088 [2024-06-10 10:15:25.725478] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1a5d010 00:20:04.088 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:04.088 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:20:04.088 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:20:04.088 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:20:04.088 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:04.088 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.089 "name": "raid_bdev1", 00:20:04.089 "uuid": "8517fa41-0125-4a1e-bd97-9fa124a93bde", 00:20:04.089 "strip_size_kb": 0, 00:20:04.089 "state": "online", 00:20:04.089 "raid_level": "raid1", 00:20:04.089 "superblock": true, 00:20:04.089 "num_base_bdevs": 4, 00:20:04.089 "num_base_bdevs_discovered": 3, 00:20:04.089 "num_base_bdevs_operational": 3, 00:20:04.089 "base_bdevs_list": [ 00:20:04.089 { 00:20:04.089 "name": null, 00:20:04.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.089 "is_configured": false, 00:20:04.089 "data_offset": 2048, 00:20:04.089 "data_size": 63488 00:20:04.089 }, 00:20:04.089 { 00:20:04.089 "name": "BaseBdev2", 00:20:04.089 "uuid": "0f8201b0-fe05-55fb-9493-175252090eb1", 00:20:04.089 "is_configured": true, 00:20:04.089 "data_offset": 2048, 00:20:04.089 "data_size": 63488 00:20:04.089 }, 00:20:04.089 { 00:20:04.089 "name": "BaseBdev3", 00:20:04.089 "uuid": "a50fbc17-478f-51fe-ac13-3803066fc307", 00:20:04.089 "is_configured": true, 00:20:04.089 "data_offset": 2048, 00:20:04.089 "data_size": 63488 00:20:04.089 }, 00:20:04.089 { 00:20:04.089 "name": "BaseBdev4", 00:20:04.089 "uuid": "26f3ce81-129c-520c-80e6-7e9827b995f8", 00:20:04.089 "is_configured": true, 00:20:04.089 "data_offset": 2048, 00:20:04.089 "data_size": 63488 00:20:04.089 } 00:20:04.089 ] 00:20:04.089 }' 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.089 10:15:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:04.660 10:15:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:04.922 [2024-06-10 10:15:26.654479] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:04.922 [2024-06-10 10:15:26.654502] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:04.922 [2024-06-10 10:15:26.657128] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:04.922 [2024-06-10 10:15:26.657153] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:04.922 [2024-06-10 10:15:26.657228] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:04.922 [2024-06-10 10:15:26.657234] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1c161b0 name raid_bdev1, state offline 00:20:04.922 0 00:20:04.922 10:15:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1068435 00:20:04.922 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1068435 ']' 00:20:04.922 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1068435 00:20:04.922 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:20:04.922 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:20:04.922 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1068435 00:20:04.922 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:20:04.922 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:20:04.922 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1068435' 00:20:04.922 killing process with pid 1068435 00:20:04.922 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1068435 00:20:04.922 [2024-06-10 10:15:26.722919] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:04.922 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1068435 00:20:04.922 [2024-06-10 10:15:26.739900] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:05.183 10:15:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.W3KGVRKcRv 00:20:05.183 10:15:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:05.183 10:15:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:05.183 10:15:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:20:05.183 10:15:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:20:05.183 10:15:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:05.183 10:15:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:05.183 10:15:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:20:05.183 00:20:05.183 real 0m5.874s 00:20:05.183 user 0m9.259s 00:20:05.183 sys 0m0.839s 00:20:05.183 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:05.183 10:15:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:05.183 ************************************ 00:20:05.183 END TEST raid_write_error_test 00:20:05.183 ************************************ 00:20:05.183 10:15:26 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:20:05.183 10:15:26 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:20:05.183 10:15:26 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:20:05.183 10:15:26 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:20:05.183 10:15:26 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:05.183 10:15:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:05.183 ************************************ 00:20:05.183 START TEST raid_rebuild_test 00:20:05.183 ************************************ 00:20:05.183 10:15:26 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 false false true 00:20:05.183 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:05.183 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:20:05.183 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:20:05.183 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:20:05.183 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:05.183 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1069495 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1069495 /var/tmp/spdk-raid.sock 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@830 -- # '[' -z 1069495 ']' 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:05.184 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
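The launch above starts bdevperf against the raid RPC socket in -z mode (idle until perform_tests is sent) and then waits for the socket to come up. The waitforlisten helper doing that wait lives in autotest_common.sh; the loop below is only a rough sketch of what it does, with illustrative variable names, assuming the same binary path and socket as the trace.

# Hedged approximation of the launch-and-wait step above.
BDEVPERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock

# -z keeps bdevperf idle until driven over RPC; -L bdev_raid enables the debug log
# flag that produces the *DEBUG* bdev_raid.c lines seen throughout this trace.
$BDEVPERF -r $SOCK -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!

# Poll the socket until the app answers a basic RPC (roughly what waitforlisten does).
until $RPC -s $SOCK rpc_get_methods >/dev/null 2>&1; do
    kill -0 $raid_pid || exit 1   # give up if bdevperf died during startup
    sleep 0.5
done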
00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:20:05.184 10:15:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:05.184 [2024-06-10 10:15:27.013256] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:20:05.184 [2024-06-10 10:15:27.013311] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1069495 ] 00:20:05.184 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:05.184 Zero copy mechanism will not be used. 00:20:05.444 [2024-06-10 10:15:27.103911] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:05.444 [2024-06-10 10:15:27.172605] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:20:05.444 [2024-06-10 10:15:27.224166] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:05.444 [2024-06-10 10:15:27.224193] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:06.015 10:15:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:20:06.015 10:15:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@863 -- # return 0 00:20:06.015 10:15:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:06.015 10:15:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:06.275 BaseBdev1_malloc 00:20:06.275 10:15:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:06.536 [2024-06-10 10:15:28.198151] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:06.536 [2024-06-10 10:15:28.198184] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:06.536 [2024-06-10 10:15:28.198198] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x111ca50 00:20:06.536 [2024-06-10 10:15:28.198204] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:06.536 [2024-06-10 10:15:28.199579] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:06.536 [2024-06-10 10:15:28.199598] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:06.536 BaseBdev1 00:20:06.536 10:15:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:06.536 10:15:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:06.536 BaseBdev2_malloc 00:20:06.536 10:15:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:06.796 [2024-06-10 10:15:28.565104] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:06.796 [2024-06-10 10:15:28.565132] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:06.796 [2024-06-10 
10:15:28.565144] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x111d5a0 00:20:06.796 [2024-06-10 10:15:28.565150] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:06.796 [2024-06-10 10:15:28.566326] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:06.796 [2024-06-10 10:15:28.566344] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:06.796 BaseBdev2 00:20:06.796 10:15:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:07.057 spare_malloc 00:20:07.057 10:15:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:07.317 spare_delay 00:20:07.317 10:15:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:07.317 [2024-06-10 10:15:29.120439] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:07.317 [2024-06-10 10:15:29.120469] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:07.317 [2024-06-10 10:15:29.120479] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12cb450 00:20:07.317 [2024-06-10 10:15:29.120485] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:07.317 [2024-06-10 10:15:29.121663] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:07.317 [2024-06-10 10:15:29.121682] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:07.317 spare 00:20:07.317 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:07.581 [2024-06-10 10:15:29.308932] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:07.581 [2024-06-10 10:15:29.309954] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:07.581 [2024-06-10 10:15:29.310012] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12ca960 00:20:07.581 [2024-06-10 10:15:29.310018] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:07.581 [2024-06-10 10:15:29.310173] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12c7340 00:20:07.581 [2024-06-10 10:15:29.310280] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12ca960 00:20:07.581 [2024-06-10 10:15:29.310286] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12ca960 00:20:07.581 [2024-06-10 10:15:29.310365] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 
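For orientation, the RPCs traced in this setup phase build each RAID member as a 32 MB malloc bdev (512-byte blocks, matching the 65536-block data_size in the dumps) wrapped in a passthru bdev, put the spare behind a delay bdev, and then assemble the raid1 volume. A condensed sketch of the same sequence follows; it assumes the spdk-raid.sock target started above, the RPC/SOCK names are shorthand, and the real test issues these calls one step at a time from bdev_raid.sh.

# Hedged recap of the setup RPCs traced above.
RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock

# Two malloc bdevs, each wrapped in a passthru bdev that the raid will claim.
for i in 1 2; do
    $RPC -s $SOCK bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
    $RPC -s $SOCK bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
done

# The spare sits behind a delay bdev (latency arguments as in the trace) plus a passthru.
$RPC -s $SOCK bdev_malloc_create 32 512 -b spare_malloc
$RPC -s $SOCK bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
$RPC -s $SOCK bdev_passthru_create -b spare_delay -p spare

# Assemble the raid1 volume; no -s here, matching "superblock": false in the dumps.
$RPC -s $SOCK bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1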
00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.581 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:07.910 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.910 "name": "raid_bdev1", 00:20:07.910 "uuid": "530497a1-a368-4981-8ddb-eceb3b5ad9a1", 00:20:07.910 "strip_size_kb": 0, 00:20:07.910 "state": "online", 00:20:07.910 "raid_level": "raid1", 00:20:07.910 "superblock": false, 00:20:07.910 "num_base_bdevs": 2, 00:20:07.910 "num_base_bdevs_discovered": 2, 00:20:07.910 "num_base_bdevs_operational": 2, 00:20:07.910 "base_bdevs_list": [ 00:20:07.910 { 00:20:07.910 "name": "BaseBdev1", 00:20:07.910 "uuid": "abc66811-9630-57aa-aca0-61a04b1069bb", 00:20:07.910 "is_configured": true, 00:20:07.910 "data_offset": 0, 00:20:07.910 "data_size": 65536 00:20:07.910 }, 00:20:07.910 { 00:20:07.910 "name": "BaseBdev2", 00:20:07.910 "uuid": "d9e19c62-d500-5ac5-b7df-60007a449630", 00:20:07.910 "is_configured": true, 00:20:07.910 "data_offset": 0, 00:20:07.910 "data_size": 65536 00:20:07.910 } 00:20:07.910 ] 00:20:07.910 }' 00:20:07.910 10:15:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.910 10:15:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:08.191 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:08.191 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:08.451 [2024-06-10 10:15:30.207372] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:08.451 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:20:08.451 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.451 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock 
raid_bdev1 /dev/nbd0 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:08.710 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:20:08.970 [2024-06-10 10:15:30.616253] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12ce480 00:20:08.970 /dev/nbd0 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:08.970 1+0 records in 00:20:08.970 1+0 records out 00:20:08.970 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000160328 s, 25.5 MB/s 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:20:08.970 10:15:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:20:08.970 10:15:30 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:20:13.174 65536+0 records in 00:20:13.174 65536+0 records out 00:20:13.174 33554432 bytes (34 MB, 32 MiB) copied, 3.75772 s, 8.9 MB/s 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:13.174 [2024-06-10 10:15:34.626490] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:13.174 [2024-06-10 10:15:34.802967] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:13.174 10:15:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:13.174 10:15:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:13.174 "name": "raid_bdev1", 00:20:13.174 "uuid": "530497a1-a368-4981-8ddb-eceb3b5ad9a1", 00:20:13.174 "strip_size_kb": 0, 00:20:13.174 "state": "online", 00:20:13.174 "raid_level": "raid1", 00:20:13.174 "superblock": false, 00:20:13.174 "num_base_bdevs": 2, 00:20:13.174 "num_base_bdevs_discovered": 1, 00:20:13.174 "num_base_bdevs_operational": 1, 00:20:13.174 "base_bdevs_list": [ 00:20:13.174 { 00:20:13.174 "name": null, 00:20:13.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.174 "is_configured": false, 00:20:13.174 "data_offset": 0, 00:20:13.174 "data_size": 65536 00:20:13.174 }, 00:20:13.174 { 00:20:13.174 "name": "BaseBdev2", 00:20:13.174 "uuid": "d9e19c62-d500-5ac5-b7df-60007a449630", 00:20:13.174 "is_configured": true, 00:20:13.174 "data_offset": 0, 00:20:13.174 "data_size": 65536 00:20:13.174 } 00:20:13.174 ] 00:20:13.174 }' 00:20:13.174 10:15:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:13.175 10:15:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.745 10:15:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:14.006 [2024-06-10 10:15:35.725301] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:14.006 [2024-06-10 10:15:35.728611] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12cd650 00:20:14.006 [2024-06-10 10:15:35.730144] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:14.006 10:15:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:14.947 10:15:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:14.947 10:15:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:14.947 10:15:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:14.947 10:15:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:14.947 10:15:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:14.947 10:15:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.947 10:15:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:15.207 10:15:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:15.207 "name": "raid_bdev1", 00:20:15.207 "uuid": "530497a1-a368-4981-8ddb-eceb3b5ad9a1", 00:20:15.207 "strip_size_kb": 0, 00:20:15.207 "state": "online", 00:20:15.207 "raid_level": "raid1", 00:20:15.207 "superblock": false, 00:20:15.207 "num_base_bdevs": 2, 00:20:15.207 "num_base_bdevs_discovered": 2, 00:20:15.207 "num_base_bdevs_operational": 2, 00:20:15.207 "process": { 00:20:15.207 "type": "rebuild", 00:20:15.207 "target": "spare", 00:20:15.207 "progress": { 00:20:15.207 "blocks": 22528, 00:20:15.207 "percent": 34 00:20:15.207 } 00:20:15.207 }, 00:20:15.207 "base_bdevs_list": [ 00:20:15.207 { 
00:20:15.207 "name": "spare", 00:20:15.207 "uuid": "c906d1d0-818f-5025-809e-5d4d81c885ce", 00:20:15.207 "is_configured": true, 00:20:15.207 "data_offset": 0, 00:20:15.207 "data_size": 65536 00:20:15.207 }, 00:20:15.207 { 00:20:15.207 "name": "BaseBdev2", 00:20:15.207 "uuid": "d9e19c62-d500-5ac5-b7df-60007a449630", 00:20:15.207 "is_configured": true, 00:20:15.207 "data_offset": 0, 00:20:15.207 "data_size": 65536 00:20:15.207 } 00:20:15.207 ] 00:20:15.207 }' 00:20:15.207 10:15:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:15.207 10:15:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:15.207 10:15:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:15.207 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:15.207 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:15.467 [2024-06-10 10:15:37.202641] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:15.467 [2024-06-10 10:15:37.239025] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:15.467 [2024-06-10 10:15:37.239058] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:15.467 [2024-06-10 10:15:37.239067] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:15.467 [2024-06-10 10:15:37.239072] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.467 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:15.727 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:15.727 "name": "raid_bdev1", 00:20:15.727 "uuid": "530497a1-a368-4981-8ddb-eceb3b5ad9a1", 00:20:15.728 "strip_size_kb": 0, 00:20:15.728 "state": "online", 00:20:15.728 "raid_level": "raid1", 00:20:15.728 "superblock": false, 00:20:15.728 "num_base_bdevs": 2, 00:20:15.728 "num_base_bdevs_discovered": 1, 00:20:15.728 
"num_base_bdevs_operational": 1, 00:20:15.728 "base_bdevs_list": [ 00:20:15.728 { 00:20:15.728 "name": null, 00:20:15.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.728 "is_configured": false, 00:20:15.728 "data_offset": 0, 00:20:15.728 "data_size": 65536 00:20:15.728 }, 00:20:15.728 { 00:20:15.728 "name": "BaseBdev2", 00:20:15.728 "uuid": "d9e19c62-d500-5ac5-b7df-60007a449630", 00:20:15.728 "is_configured": true, 00:20:15.728 "data_offset": 0, 00:20:15.728 "data_size": 65536 00:20:15.728 } 00:20:15.728 ] 00:20:15.728 }' 00:20:15.728 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:15.728 10:15:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:16.298 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:16.298 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:16.298 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:16.298 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:16.298 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:16.298 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.298 10:15:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:16.298 10:15:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:16.298 "name": "raid_bdev1", 00:20:16.298 "uuid": "530497a1-a368-4981-8ddb-eceb3b5ad9a1", 00:20:16.298 "strip_size_kb": 0, 00:20:16.298 "state": "online", 00:20:16.298 "raid_level": "raid1", 00:20:16.298 "superblock": false, 00:20:16.298 "num_base_bdevs": 2, 00:20:16.298 "num_base_bdevs_discovered": 1, 00:20:16.298 "num_base_bdevs_operational": 1, 00:20:16.298 "base_bdevs_list": [ 00:20:16.298 { 00:20:16.298 "name": null, 00:20:16.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.298 "is_configured": false, 00:20:16.298 "data_offset": 0, 00:20:16.298 "data_size": 65536 00:20:16.298 }, 00:20:16.298 { 00:20:16.298 "name": "BaseBdev2", 00:20:16.298 "uuid": "d9e19c62-d500-5ac5-b7df-60007a449630", 00:20:16.298 "is_configured": true, 00:20:16.298 "data_offset": 0, 00:20:16.298 "data_size": 65536 00:20:16.298 } 00:20:16.298 ] 00:20:16.298 }' 00:20:16.298 10:15:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:16.559 10:15:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:16.559 10:15:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:16.559 10:15:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:16.559 10:15:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:16.559 [2024-06-10 10:15:38.413872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:16.559 [2024-06-10 10:15:38.417240] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1253da0 00:20:16.559 [2024-06-10 10:15:38.418375] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on 
raid bdev raid_bdev1 00:20:16.819 10:15:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:17.760 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:17.760 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:17.760 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:17.760 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:17.760 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:17.760 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.760 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:17.760 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:17.760 "name": "raid_bdev1", 00:20:17.760 "uuid": "530497a1-a368-4981-8ddb-eceb3b5ad9a1", 00:20:17.760 "strip_size_kb": 0, 00:20:17.760 "state": "online", 00:20:17.760 "raid_level": "raid1", 00:20:17.760 "superblock": false, 00:20:17.760 "num_base_bdevs": 2, 00:20:17.760 "num_base_bdevs_discovered": 2, 00:20:17.760 "num_base_bdevs_operational": 2, 00:20:17.760 "process": { 00:20:17.760 "type": "rebuild", 00:20:17.760 "target": "spare", 00:20:17.760 "progress": { 00:20:17.760 "blocks": 22528, 00:20:17.760 "percent": 34 00:20:17.760 } 00:20:17.760 }, 00:20:17.760 "base_bdevs_list": [ 00:20:17.760 { 00:20:17.760 "name": "spare", 00:20:17.760 "uuid": "c906d1d0-818f-5025-809e-5d4d81c885ce", 00:20:17.760 "is_configured": true, 00:20:17.760 "data_offset": 0, 00:20:17.760 "data_size": 65536 00:20:17.760 }, 00:20:17.760 { 00:20:17.760 "name": "BaseBdev2", 00:20:17.760 "uuid": "d9e19c62-d500-5ac5-b7df-60007a449630", 00:20:17.760 "is_configured": true, 00:20:17.760 "data_offset": 0, 00:20:17.760 "data_size": 65536 00:20:17.760 } 00:20:17.760 ] 00:20:17.760 }' 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=631 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 
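The bdev_raid_add_base_bdev call above claims the spare and kicks off a rebuild, and the test then repeatedly re-reads the raid bdev until the rebuild process is gone. Below is a hedged sketch of such a polling loop; the jq paths mirror the ones verify_raid_bdev_process uses in this trace plus the progress block visible in the JSON dumps, and the RPC/SOCK names are again illustrative.

# Hedged sketch of watching the rebuild started above until it completes.
RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock

while true; do
    info=$($RPC -s $SOCK bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    ptype=$(echo "$info" | jq -r '.process.type // "none"')
    # While the rebuild runs, the dump carries a process object with target "spare"
    # and a progress block (blocks / percent), as the dumps above show.
    [ "$ptype" = "none" ] && break
    echo "$info" | jq -r '"\(.process.target) \(.process.progress.percent)%"'
    sleep 1
done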
00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.021 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:18.281 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:18.281 "name": "raid_bdev1", 00:20:18.281 "uuid": "530497a1-a368-4981-8ddb-eceb3b5ad9a1", 00:20:18.281 "strip_size_kb": 0, 00:20:18.281 "state": "online", 00:20:18.281 "raid_level": "raid1", 00:20:18.281 "superblock": false, 00:20:18.281 "num_base_bdevs": 2, 00:20:18.281 "num_base_bdevs_discovered": 2, 00:20:18.281 "num_base_bdevs_operational": 2, 00:20:18.281 "process": { 00:20:18.281 "type": "rebuild", 00:20:18.281 "target": "spare", 00:20:18.281 "progress": { 00:20:18.281 "blocks": 28672, 00:20:18.281 "percent": 43 00:20:18.281 } 00:20:18.281 }, 00:20:18.281 "base_bdevs_list": [ 00:20:18.281 { 00:20:18.281 "name": "spare", 00:20:18.281 "uuid": "c906d1d0-818f-5025-809e-5d4d81c885ce", 00:20:18.281 "is_configured": true, 00:20:18.281 "data_offset": 0, 00:20:18.281 "data_size": 65536 00:20:18.281 }, 00:20:18.281 { 00:20:18.281 "name": "BaseBdev2", 00:20:18.281 "uuid": "d9e19c62-d500-5ac5-b7df-60007a449630", 00:20:18.281 "is_configured": true, 00:20:18.281 "data_offset": 0, 00:20:18.281 "data_size": 65536 00:20:18.281 } 00:20:18.281 ] 00:20:18.281 }' 00:20:18.281 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:18.281 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:18.281 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:18.281 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:18.281 10:15:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:19.223 10:15:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:19.223 10:15:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:19.223 10:15:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:19.223 10:15:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:19.223 10:15:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:19.223 10:15:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:19.223 10:15:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.223 10:15:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:19.484 10:15:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:19.484 "name": "raid_bdev1", 00:20:19.484 "uuid": "530497a1-a368-4981-8ddb-eceb3b5ad9a1", 00:20:19.484 "strip_size_kb": 0, 00:20:19.484 "state": "online", 00:20:19.484 "raid_level": "raid1", 00:20:19.484 "superblock": false, 00:20:19.484 "num_base_bdevs": 2, 
00:20:19.484 "num_base_bdevs_discovered": 2, 00:20:19.484 "num_base_bdevs_operational": 2, 00:20:19.484 "process": { 00:20:19.484 "type": "rebuild", 00:20:19.484 "target": "spare", 00:20:19.484 "progress": { 00:20:19.484 "blocks": 55296, 00:20:19.484 "percent": 84 00:20:19.484 } 00:20:19.484 }, 00:20:19.484 "base_bdevs_list": [ 00:20:19.484 { 00:20:19.484 "name": "spare", 00:20:19.484 "uuid": "c906d1d0-818f-5025-809e-5d4d81c885ce", 00:20:19.484 "is_configured": true, 00:20:19.484 "data_offset": 0, 00:20:19.484 "data_size": 65536 00:20:19.484 }, 00:20:19.484 { 00:20:19.484 "name": "BaseBdev2", 00:20:19.484 "uuid": "d9e19c62-d500-5ac5-b7df-60007a449630", 00:20:19.484 "is_configured": true, 00:20:19.484 "data_offset": 0, 00:20:19.484 "data_size": 65536 00:20:19.484 } 00:20:19.484 ] 00:20:19.484 }' 00:20:19.484 10:15:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:19.484 10:15:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:19.484 10:15:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:19.484 10:15:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:19.484 10:15:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:20.055 [2024-06-10 10:15:41.637076] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:20.055 [2024-06-10 10:15:41.637122] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:20.055 [2024-06-10 10:15:41.637148] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:20.626 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:20.626 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:20.626 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:20.626 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:20.626 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:20.626 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:20.626 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.626 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:20.626 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:20.626 "name": "raid_bdev1", 00:20:20.626 "uuid": "530497a1-a368-4981-8ddb-eceb3b5ad9a1", 00:20:20.626 "strip_size_kb": 0, 00:20:20.626 "state": "online", 00:20:20.626 "raid_level": "raid1", 00:20:20.626 "superblock": false, 00:20:20.626 "num_base_bdevs": 2, 00:20:20.626 "num_base_bdevs_discovered": 2, 00:20:20.626 "num_base_bdevs_operational": 2, 00:20:20.626 "base_bdevs_list": [ 00:20:20.626 { 00:20:20.626 "name": "spare", 00:20:20.626 "uuid": "c906d1d0-818f-5025-809e-5d4d81c885ce", 00:20:20.626 "is_configured": true, 00:20:20.626 "data_offset": 0, 00:20:20.626 "data_size": 65536 00:20:20.626 }, 00:20:20.626 { 00:20:20.626 "name": "BaseBdev2", 00:20:20.626 "uuid": "d9e19c62-d500-5ac5-b7df-60007a449630", 00:20:20.626 "is_configured": true, 
00:20:20.626 "data_offset": 0, 00:20:20.626 "data_size": 65536 00:20:20.626 } 00:20:20.626 ] 00:20:20.626 }' 00:20:20.626 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:20.886 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:20.886 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:20.886 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:20.886 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:20:20.886 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:20.886 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:20.886 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:20.886 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:20.886 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:20.886 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.886 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:21.147 "name": "raid_bdev1", 00:20:21.147 "uuid": "530497a1-a368-4981-8ddb-eceb3b5ad9a1", 00:20:21.147 "strip_size_kb": 0, 00:20:21.147 "state": "online", 00:20:21.147 "raid_level": "raid1", 00:20:21.147 "superblock": false, 00:20:21.147 "num_base_bdevs": 2, 00:20:21.147 "num_base_bdevs_discovered": 2, 00:20:21.147 "num_base_bdevs_operational": 2, 00:20:21.147 "base_bdevs_list": [ 00:20:21.147 { 00:20:21.147 "name": "spare", 00:20:21.147 "uuid": "c906d1d0-818f-5025-809e-5d4d81c885ce", 00:20:21.147 "is_configured": true, 00:20:21.147 "data_offset": 0, 00:20:21.147 "data_size": 65536 00:20:21.147 }, 00:20:21.147 { 00:20:21.147 "name": "BaseBdev2", 00:20:21.147 "uuid": "d9e19c62-d500-5ac5-b7df-60007a449630", 00:20:21.147 "is_configured": true, 00:20:21.147 "data_offset": 0, 00:20:21.147 "data_size": 65536 00:20:21.147 } 00:20:21.147 ] 00:20:21.147 }' 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.147 10:15:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:21.407 10:15:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.407 "name": "raid_bdev1", 00:20:21.407 "uuid": "530497a1-a368-4981-8ddb-eceb3b5ad9a1", 00:20:21.407 "strip_size_kb": 0, 00:20:21.407 "state": "online", 00:20:21.407 "raid_level": "raid1", 00:20:21.407 "superblock": false, 00:20:21.407 "num_base_bdevs": 2, 00:20:21.407 "num_base_bdevs_discovered": 2, 00:20:21.407 "num_base_bdevs_operational": 2, 00:20:21.407 "base_bdevs_list": [ 00:20:21.407 { 00:20:21.407 "name": "spare", 00:20:21.407 "uuid": "c906d1d0-818f-5025-809e-5d4d81c885ce", 00:20:21.407 "is_configured": true, 00:20:21.407 "data_offset": 0, 00:20:21.407 "data_size": 65536 00:20:21.407 }, 00:20:21.407 { 00:20:21.407 "name": "BaseBdev2", 00:20:21.407 "uuid": "d9e19c62-d500-5ac5-b7df-60007a449630", 00:20:21.407 "is_configured": true, 00:20:21.407 "data_offset": 0, 00:20:21.407 "data_size": 65536 00:20:21.407 } 00:20:21.407 ] 00:20:21.407 }' 00:20:21.407 10:15:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.407 10:15:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:21.977 10:15:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:21.977 [2024-06-10 10:15:43.769388] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:21.977 [2024-06-10 10:15:43.769405] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:21.977 [2024-06-10 10:15:43.769447] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:21.977 [2024-06-10 10:15:43.769486] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:21.977 [2024-06-10 10:15:43.769492] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12ca960 name raid_bdev1, state offline 00:20:21.977 10:15:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.977 10:15:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:22.238 10:15:43 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:22.238 10:15:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:22.497 /dev/nbd0 00:20:22.497 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:22.497 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:22.497 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:20:22.497 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:20:22.497 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:22.497 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:22.497 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:20:22.497 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:20:22.497 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:22.497 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:22.497 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:22.497 1+0 records in 00:20:22.497 1+0 records out 00:20:22.497 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269305 s, 15.2 MB/s 00:20:22.498 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:22.498 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:20:22.498 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:22.498 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:20:22.498 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:20:22.498 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:22.498 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:22.498 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:22.758 /dev/nbd1 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:22.758 1+0 records in 00:20:22.758 1+0 records out 00:20:22.758 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273795 s, 15.0 MB/s 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:22.758 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:23.018 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:23.018 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:23.018 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:23.018 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:23.018 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:23.018 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q 
-w nbd0 /proc/partitions 00:20:23.018 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:23.018 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:23.018 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:23.018 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1069495 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@949 -- # '[' -z 1069495 ']' 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # kill -0 1069495 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # uname 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1069495 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1069495' 00:20:23.279 killing process with pid 1069495 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # kill 1069495 00:20:23.279 Received shutdown signal, test time was about 60.000000 seconds 00:20:23.279 00:20:23.279 Latency(us) 00:20:23.279 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:23.279 =================================================================================================================== 00:20:23.279 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:23.279 [2024-06-10 10:15:44.952341] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:23.279 10:15:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@973 -- # wait 1069495 00:20:23.279 [2024-06-10 10:15:44.966356] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:23.279 10:15:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:20:23.279 00:20:23.279 real 0m18.135s 00:20:23.279 user 0m25.508s 00:20:23.279 sys 0m2.853s 00:20:23.279 10:15:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:23.279 10:15:45 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:23.279 ************************************ 00:20:23.279 END TEST raid_rebuild_test 00:20:23.279 ************************************ 00:20:23.279 10:15:45 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:20:23.279 10:15:45 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:20:23.279 10:15:45 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:23.279 10:15:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:23.540 ************************************ 00:20:23.540 START TEST raid_rebuild_test_sb 00:20:23.540 ************************************ 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:23.540 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1072768 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 
1072768 /var/tmp/spdk-raid.sock 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1072768 ']' 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:23.541 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:20:23.541 10:15:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:23.541 [2024-06-10 10:15:45.224515] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:20:23.541 [2024-06-10 10:15:45.224560] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1072768 ] 00:20:23.541 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:23.541 Zero copy mechanism will not be used. 00:20:23.541 [2024-06-10 10:15:45.310367] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:23.541 [2024-06-10 10:15:45.373601] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:20:23.801 [2024-06-10 10:15:45.423941] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:23.801 [2024-06-10 10:15:45.423967] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:24.372 10:15:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:20:24.372 10:15:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@863 -- # return 0 00:20:24.372 10:15:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:24.372 10:15:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:24.372 BaseBdev1_malloc 00:20:24.633 10:15:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:24.633 [2024-06-10 10:15:46.422391] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:24.633 [2024-06-10 10:15:46.422425] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:24.633 [2024-06-10 10:15:46.422441] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1daba50 00:20:24.633 [2024-06-10 10:15:46.422447] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:24.633 [2024-06-10 10:15:46.423754] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:24.633 [2024-06-10 10:15:46.423773] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:24.633 BaseBdev1 00:20:24.633 10:15:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:24.633 10:15:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:24.893 BaseBdev2_malloc 00:20:24.893 10:15:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:25.154 [2024-06-10 10:15:46.789450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:25.154 [2024-06-10 10:15:46.789476] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:25.154 [2024-06-10 10:15:46.789489] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dac5a0 00:20:25.154 [2024-06-10 10:15:46.789495] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:25.154 [2024-06-10 10:15:46.790664] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:25.154 [2024-06-10 10:15:46.790682] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:25.154 BaseBdev2 00:20:25.154 10:15:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:25.154 spare_malloc 00:20:25.154 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:25.414 spare_delay 00:20:25.414 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:25.675 [2024-06-10 10:15:47.348817] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:25.675 [2024-06-10 10:15:47.348851] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:25.675 [2024-06-10 10:15:47.348863] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f5a450 00:20:25.675 [2024-06-10 10:15:47.348869] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:25.675 [2024-06-10 10:15:47.350046] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:25.675 [2024-06-10 10:15:47.350064] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:25.675 spare 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:25.675 [2024-06-10 10:15:47.521262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:25.675 [2024-06-10 10:15:47.522244] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:25.675 [2024-06-10 10:15:47.522362] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f59960 00:20:25.675 
[2024-06-10 10:15:47.522370] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:25.675 [2024-06-10 10:15:47.522513] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1daa530 00:20:25.675 [2024-06-10 10:15:47.522615] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f59960 00:20:25.675 [2024-06-10 10:15:47.522621] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f59960 00:20:25.675 [2024-06-10 10:15:47.522687] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.675 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.936 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.936 "name": "raid_bdev1", 00:20:25.936 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:25.936 "strip_size_kb": 0, 00:20:25.936 "state": "online", 00:20:25.936 "raid_level": "raid1", 00:20:25.936 "superblock": true, 00:20:25.936 "num_base_bdevs": 2, 00:20:25.936 "num_base_bdevs_discovered": 2, 00:20:25.936 "num_base_bdevs_operational": 2, 00:20:25.936 "base_bdevs_list": [ 00:20:25.936 { 00:20:25.936 "name": "BaseBdev1", 00:20:25.936 "uuid": "4d172057-7fb7-591a-8e95-4602472a1f57", 00:20:25.936 "is_configured": true, 00:20:25.936 "data_offset": 2048, 00:20:25.936 "data_size": 63488 00:20:25.936 }, 00:20:25.936 { 00:20:25.936 "name": "BaseBdev2", 00:20:25.936 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:25.936 "is_configured": true, 00:20:25.936 "data_offset": 2048, 00:20:25.936 "data_size": 63488 00:20:25.936 } 00:20:25.936 ] 00:20:25.936 }' 00:20:25.936 10:15:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.936 10:15:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:26.507 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:26.507 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:26.767 [2024-06-10 
10:15:48.423698] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:26.767 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:20:26.767 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.767 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:20:27.027 [2024-06-10 10:15:48.824577] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f5c4f0 00:20:27.027 /dev/nbd0 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:27.027 1+0 records in 00:20:27.027 1+0 records 
out 00:20:27.027 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269304 s, 15.2 MB/s 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:20:27.027 10:15:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:20:31.229 63488+0 records in 00:20:31.229 63488+0 records out 00:20:31.229 32505856 bytes (33 MB, 31 MiB) copied, 3.92769 s, 8.3 MB/s 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:31.229 [2024-06-10 10:15:52.974527] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:31.229 10:15:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:31.489 [2024-06-10 10:15:53.155018] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.489 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:31.749 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.749 "name": "raid_bdev1", 00:20:31.749 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:31.749 "strip_size_kb": 0, 00:20:31.749 "state": "online", 00:20:31.749 "raid_level": "raid1", 00:20:31.749 "superblock": true, 00:20:31.749 "num_base_bdevs": 2, 00:20:31.749 "num_base_bdevs_discovered": 1, 00:20:31.749 "num_base_bdevs_operational": 1, 00:20:31.749 "base_bdevs_list": [ 00:20:31.749 { 00:20:31.749 "name": null, 00:20:31.749 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.749 "is_configured": false, 00:20:31.749 "data_offset": 2048, 00:20:31.749 "data_size": 63488 00:20:31.749 }, 00:20:31.749 { 00:20:31.749 "name": "BaseBdev2", 00:20:31.749 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:31.749 "is_configured": true, 00:20:31.749 "data_offset": 2048, 00:20:31.749 "data_size": 63488 00:20:31.749 } 00:20:31.749 ] 00:20:31.749 }' 00:20:31.749 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.749 10:15:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:32.319 10:15:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:32.319 [2024-06-10 10:15:54.093387] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:32.319 [2024-06-10 10:15:54.096691] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dab130 00:20:32.319 [2024-06-10 10:15:54.098229] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:32.319 10:15:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:33.260 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:33.260 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:33.260 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:20:33.260 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:33.260 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:33.260 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.260 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:33.520 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:33.520 "name": "raid_bdev1", 00:20:33.520 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:33.520 "strip_size_kb": 0, 00:20:33.520 "state": "online", 00:20:33.520 "raid_level": "raid1", 00:20:33.520 "superblock": true, 00:20:33.520 "num_base_bdevs": 2, 00:20:33.520 "num_base_bdevs_discovered": 2, 00:20:33.520 "num_base_bdevs_operational": 2, 00:20:33.520 "process": { 00:20:33.520 "type": "rebuild", 00:20:33.520 "target": "spare", 00:20:33.520 "progress": { 00:20:33.520 "blocks": 22528, 00:20:33.520 "percent": 35 00:20:33.520 } 00:20:33.520 }, 00:20:33.520 "base_bdevs_list": [ 00:20:33.520 { 00:20:33.520 "name": "spare", 00:20:33.520 "uuid": "8bedad51-0f32-5d27-9158-ddd3f920c1c7", 00:20:33.520 "is_configured": true, 00:20:33.520 "data_offset": 2048, 00:20:33.520 "data_size": 63488 00:20:33.520 }, 00:20:33.520 { 00:20:33.520 "name": "BaseBdev2", 00:20:33.520 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:33.520 "is_configured": true, 00:20:33.520 "data_offset": 2048, 00:20:33.520 "data_size": 63488 00:20:33.520 } 00:20:33.520 ] 00:20:33.520 }' 00:20:33.520 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:33.520 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:33.520 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:33.780 [2024-06-10 10:15:55.571009] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:33.780 [2024-06-10 10:15:55.607084] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:33.780 [2024-06-10 10:15:55.607116] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:33.780 [2024-06-10 10:15:55.607125] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:33.780 [2024-06-10 10:15:55.607129] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.780 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:34.089 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:34.089 "name": "raid_bdev1", 00:20:34.089 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:34.089 "strip_size_kb": 0, 00:20:34.089 "state": "online", 00:20:34.089 "raid_level": "raid1", 00:20:34.089 "superblock": true, 00:20:34.089 "num_base_bdevs": 2, 00:20:34.089 "num_base_bdevs_discovered": 1, 00:20:34.089 "num_base_bdevs_operational": 1, 00:20:34.089 "base_bdevs_list": [ 00:20:34.089 { 00:20:34.089 "name": null, 00:20:34.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.089 "is_configured": false, 00:20:34.089 "data_offset": 2048, 00:20:34.089 "data_size": 63488 00:20:34.089 }, 00:20:34.089 { 00:20:34.089 "name": "BaseBdev2", 00:20:34.089 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:34.089 "is_configured": true, 00:20:34.089 "data_offset": 2048, 00:20:34.089 "data_size": 63488 00:20:34.089 } 00:20:34.089 ] 00:20:34.089 }' 00:20:34.089 10:15:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:34.089 10:15:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:34.661 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:34.661 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:34.661 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:34.661 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:34.661 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:34.661 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:34.661 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.922 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:34.922 "name": "raid_bdev1", 00:20:34.922 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:34.922 "strip_size_kb": 0, 00:20:34.922 "state": "online", 00:20:34.922 "raid_level": "raid1", 00:20:34.922 "superblock": true, 00:20:34.922 "num_base_bdevs": 2, 00:20:34.922 "num_base_bdevs_discovered": 1, 00:20:34.922 "num_base_bdevs_operational": 1, 00:20:34.922 "base_bdevs_list": [ 00:20:34.922 { 00:20:34.922 "name": null, 00:20:34.922 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.922 
"is_configured": false, 00:20:34.922 "data_offset": 2048, 00:20:34.922 "data_size": 63488 00:20:34.922 }, 00:20:34.922 { 00:20:34.922 "name": "BaseBdev2", 00:20:34.922 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:34.922 "is_configured": true, 00:20:34.922 "data_offset": 2048, 00:20:34.922 "data_size": 63488 00:20:34.922 } 00:20:34.922 ] 00:20:34.922 }' 00:20:34.923 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:34.923 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:34.923 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:34.923 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:34.923 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:35.183 [2024-06-10 10:15:56.838223] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:35.183 [2024-06-10 10:15:56.841625] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f56100 00:20:35.183 [2024-06-10 10:15:56.842762] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:35.183 10:15:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:36.124 10:15:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:36.124 10:15:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:36.124 10:15:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:36.124 10:15:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:36.124 10:15:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:36.124 10:15:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.124 10:15:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:36.384 "name": "raid_bdev1", 00:20:36.384 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:36.384 "strip_size_kb": 0, 00:20:36.384 "state": "online", 00:20:36.384 "raid_level": "raid1", 00:20:36.384 "superblock": true, 00:20:36.384 "num_base_bdevs": 2, 00:20:36.384 "num_base_bdevs_discovered": 2, 00:20:36.384 "num_base_bdevs_operational": 2, 00:20:36.384 "process": { 00:20:36.384 "type": "rebuild", 00:20:36.384 "target": "spare", 00:20:36.384 "progress": { 00:20:36.384 "blocks": 22528, 00:20:36.384 "percent": 35 00:20:36.384 } 00:20:36.384 }, 00:20:36.384 "base_bdevs_list": [ 00:20:36.384 { 00:20:36.384 "name": "spare", 00:20:36.384 "uuid": "8bedad51-0f32-5d27-9158-ddd3f920c1c7", 00:20:36.384 "is_configured": true, 00:20:36.384 "data_offset": 2048, 00:20:36.384 "data_size": 63488 00:20:36.384 }, 00:20:36.384 { 00:20:36.384 "name": "BaseBdev2", 00:20:36.384 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:36.384 "is_configured": true, 00:20:36.384 "data_offset": 2048, 00:20:36.384 "data_size": 63488 00:20:36.384 } 00:20:36.384 ] 
00:20:36.384 }' 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:20:36.384 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=650 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.384 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:36.644 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:36.644 "name": "raid_bdev1", 00:20:36.644 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:36.644 "strip_size_kb": 0, 00:20:36.644 "state": "online", 00:20:36.644 "raid_level": "raid1", 00:20:36.644 "superblock": true, 00:20:36.644 "num_base_bdevs": 2, 00:20:36.644 "num_base_bdevs_discovered": 2, 00:20:36.644 "num_base_bdevs_operational": 2, 00:20:36.644 "process": { 00:20:36.644 "type": "rebuild", 00:20:36.644 "target": "spare", 00:20:36.644 "progress": { 00:20:36.644 "blocks": 28672, 00:20:36.644 "percent": 45 00:20:36.644 } 00:20:36.644 }, 00:20:36.644 "base_bdevs_list": [ 00:20:36.644 { 00:20:36.644 "name": "spare", 00:20:36.644 "uuid": "8bedad51-0f32-5d27-9158-ddd3f920c1c7", 00:20:36.644 "is_configured": true, 00:20:36.644 "data_offset": 2048, 00:20:36.644 "data_size": 63488 00:20:36.644 }, 00:20:36.644 { 00:20:36.644 "name": "BaseBdev2", 00:20:36.644 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:36.644 "is_configured": true, 00:20:36.644 "data_offset": 2048, 00:20:36.644 "data_size": 63488 00:20:36.644 } 00:20:36.644 ] 00:20:36.644 }' 00:20:36.644 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:36.644 10:15:58 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:36.644 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:36.644 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:36.644 10:15:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:37.586 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:37.586 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:37.586 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:37.586 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:37.586 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:37.586 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:37.586 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.586 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:37.848 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:37.848 "name": "raid_bdev1", 00:20:37.848 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:37.848 "strip_size_kb": 0, 00:20:37.848 "state": "online", 00:20:37.848 "raid_level": "raid1", 00:20:37.848 "superblock": true, 00:20:37.848 "num_base_bdevs": 2, 00:20:37.848 "num_base_bdevs_discovered": 2, 00:20:37.848 "num_base_bdevs_operational": 2, 00:20:37.848 "process": { 00:20:37.848 "type": "rebuild", 00:20:37.848 "target": "spare", 00:20:37.848 "progress": { 00:20:37.848 "blocks": 55296, 00:20:37.848 "percent": 87 00:20:37.848 } 00:20:37.848 }, 00:20:37.848 "base_bdevs_list": [ 00:20:37.848 { 00:20:37.848 "name": "spare", 00:20:37.848 "uuid": "8bedad51-0f32-5d27-9158-ddd3f920c1c7", 00:20:37.848 "is_configured": true, 00:20:37.848 "data_offset": 2048, 00:20:37.848 "data_size": 63488 00:20:37.848 }, 00:20:37.848 { 00:20:37.848 "name": "BaseBdev2", 00:20:37.848 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:37.848 "is_configured": true, 00:20:37.848 "data_offset": 2048, 00:20:37.848 "data_size": 63488 00:20:37.848 } 00:20:37.848 ] 00:20:37.848 }' 00:20:37.848 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:37.848 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:37.848 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:37.848 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:37.848 10:15:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:38.108 [2024-06-10 10:15:59.960978] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:38.108 [2024-06-10 10:15:59.961024] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:38.108 [2024-06-10 10:15:59.961085] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:39.048 10:16:00 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:39.048 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:39.048 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:39.048 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:39.048 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:39.048 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:39.048 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.048 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.048 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:39.048 "name": "raid_bdev1", 00:20:39.048 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:39.048 "strip_size_kb": 0, 00:20:39.048 "state": "online", 00:20:39.048 "raid_level": "raid1", 00:20:39.048 "superblock": true, 00:20:39.048 "num_base_bdevs": 2, 00:20:39.048 "num_base_bdevs_discovered": 2, 00:20:39.048 "num_base_bdevs_operational": 2, 00:20:39.048 "base_bdevs_list": [ 00:20:39.048 { 00:20:39.048 "name": "spare", 00:20:39.048 "uuid": "8bedad51-0f32-5d27-9158-ddd3f920c1c7", 00:20:39.048 "is_configured": true, 00:20:39.048 "data_offset": 2048, 00:20:39.048 "data_size": 63488 00:20:39.048 }, 00:20:39.048 { 00:20:39.048 "name": "BaseBdev2", 00:20:39.048 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:39.048 "is_configured": true, 00:20:39.048 "data_offset": 2048, 00:20:39.048 "data_size": 63488 00:20:39.048 } 00:20:39.048 ] 00:20:39.048 }' 00:20:39.048 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:39.307 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:39.307 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:39.307 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:39.307 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:20:39.307 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:39.307 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:39.307 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:39.307 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:39.307 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:39.307 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.307 10:16:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.307 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:39.307 "name": "raid_bdev1", 00:20:39.307 "uuid": 
"fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:39.307 "strip_size_kb": 0, 00:20:39.307 "state": "online", 00:20:39.307 "raid_level": "raid1", 00:20:39.307 "superblock": true, 00:20:39.307 "num_base_bdevs": 2, 00:20:39.307 "num_base_bdevs_discovered": 2, 00:20:39.307 "num_base_bdevs_operational": 2, 00:20:39.307 "base_bdevs_list": [ 00:20:39.307 { 00:20:39.307 "name": "spare", 00:20:39.307 "uuid": "8bedad51-0f32-5d27-9158-ddd3f920c1c7", 00:20:39.307 "is_configured": true, 00:20:39.307 "data_offset": 2048, 00:20:39.307 "data_size": 63488 00:20:39.307 }, 00:20:39.307 { 00:20:39.307 "name": "BaseBdev2", 00:20:39.307 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:39.307 "is_configured": true, 00:20:39.307 "data_offset": 2048, 00:20:39.307 "data_size": 63488 00:20:39.307 } 00:20:39.307 ] 00:20:39.307 }' 00:20:39.307 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.567 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.826 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:39.826 "name": "raid_bdev1", 00:20:39.826 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:39.826 "strip_size_kb": 0, 00:20:39.826 "state": "online", 00:20:39.826 "raid_level": "raid1", 00:20:39.826 "superblock": true, 00:20:39.826 "num_base_bdevs": 2, 00:20:39.826 "num_base_bdevs_discovered": 2, 00:20:39.826 "num_base_bdevs_operational": 2, 00:20:39.826 "base_bdevs_list": [ 00:20:39.826 { 00:20:39.826 "name": "spare", 00:20:39.826 "uuid": "8bedad51-0f32-5d27-9158-ddd3f920c1c7", 00:20:39.826 "is_configured": true, 00:20:39.826 "data_offset": 2048, 00:20:39.826 "data_size": 63488 00:20:39.826 }, 00:20:39.826 { 00:20:39.826 "name": "BaseBdev2", 00:20:39.826 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:39.826 "is_configured": true, 
00:20:39.826 "data_offset": 2048, 00:20:39.826 "data_size": 63488 00:20:39.826 } 00:20:39.826 ] 00:20:39.826 }' 00:20:39.826 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:39.826 10:16:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:40.394 10:16:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:40.394 [2024-06-10 10:16:02.150698] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:40.394 [2024-06-10 10:16:02.150717] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:40.394 [2024-06-10 10:16:02.150763] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:40.394 [2024-06-10 10:16:02.150804] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:40.394 [2024-06-10 10:16:02.150811] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f59960 name raid_bdev1, state offline 00:20:40.394 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.394 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:40.653 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:40.913 /dev/nbd0 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:40.913 10:16:02 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:40.913 1+0 records in 00:20:40.913 1+0 records out 00:20:40.913 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000328161 s, 12.5 MB/s 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:40.913 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:41.172 /dev/nbd1 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:41.172 1+0 records in 00:20:41.172 1+0 records out 00:20:41.172 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024463 s, 16.7 MB/s 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # 
size=4096 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:41.172 10:16:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- 
# '[' true = true ']' 00:20:41.431 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:41.690 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:41.950 [2024-06-10 10:16:03.646414] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:41.950 [2024-06-10 10:16:03.646445] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:41.950 [2024-06-10 10:16:03.646460] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f572c0 00:20:41.950 [2024-06-10 10:16:03.646467] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:41.950 [2024-06-10 10:16:03.647780] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:41.950 [2024-06-10 10:16:03.647802] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:41.950 [2024-06-10 10:16:03.647873] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:41.950 [2024-06-10 10:16:03.647900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:41.950 [2024-06-10 10:16:03.647979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:41.950 spare 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.950 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:41.950 [2024-06-10 10:16:03.748268] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1db0cf0 00:20:41.950 [2024-06-10 10:16:03.748277] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:41.950 [2024-06-10 10:16:03.748432] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dae1d0 00:20:41.950 [2024-06-10 10:16:03.748545] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1db0cf0 00:20:41.950 [2024-06-10 10:16:03.748550] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1db0cf0 00:20:41.950 [2024-06-10 10:16:03.748629] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:42.209 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.209 "name": "raid_bdev1", 00:20:42.209 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:42.209 "strip_size_kb": 0, 00:20:42.209 "state": "online", 00:20:42.209 "raid_level": "raid1", 00:20:42.209 "superblock": true, 00:20:42.209 "num_base_bdevs": 2, 00:20:42.209 "num_base_bdevs_discovered": 2, 00:20:42.209 "num_base_bdevs_operational": 2, 00:20:42.209 "base_bdevs_list": [ 00:20:42.209 { 00:20:42.209 "name": "spare", 00:20:42.209 "uuid": "8bedad51-0f32-5d27-9158-ddd3f920c1c7", 00:20:42.209 "is_configured": true, 00:20:42.209 "data_offset": 2048, 00:20:42.209 "data_size": 63488 00:20:42.209 }, 00:20:42.210 { 00:20:42.210 "name": "BaseBdev2", 00:20:42.210 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:42.210 "is_configured": true, 00:20:42.210 "data_offset": 2048, 00:20:42.210 "data_size": 63488 00:20:42.210 } 00:20:42.210 ] 00:20:42.210 }' 00:20:42.210 10:16:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.210 10:16:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:42.779 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:42.779 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:42.779 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:42.779 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:42.779 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:42.779 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.779 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.779 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:42.779 "name": "raid_bdev1", 00:20:42.779 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:42.779 "strip_size_kb": 0, 00:20:42.779 "state": "online", 00:20:42.779 "raid_level": "raid1", 00:20:42.779 "superblock": true, 00:20:42.779 "num_base_bdevs": 2, 00:20:42.779 "num_base_bdevs_discovered": 2, 00:20:42.779 "num_base_bdevs_operational": 2, 00:20:42.779 "base_bdevs_list": [ 00:20:42.779 { 00:20:42.779 "name": "spare", 00:20:42.779 "uuid": "8bedad51-0f32-5d27-9158-ddd3f920c1c7", 00:20:42.779 "is_configured": true, 00:20:42.779 "data_offset": 2048, 00:20:42.779 "data_size": 63488 00:20:42.779 }, 00:20:42.779 { 00:20:42.779 "name": "BaseBdev2", 00:20:42.779 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:42.779 "is_configured": true, 00:20:42.779 "data_offset": 2048, 00:20:42.779 "data_size": 63488 00:20:42.779 } 00:20:42.779 ] 00:20:42.779 }' 00:20:42.779 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:42.779 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:42.779 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:20:43.038 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:43.038 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.038 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:43.038 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:20:43.038 10:16:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:43.298 [2024-06-10 10:16:05.017965] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:43.298 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:43.298 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:43.298 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:43.298 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:43.298 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:43.298 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:43.298 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:43.298 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:43.298 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:43.298 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:43.299 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.299 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:43.558 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:43.558 "name": "raid_bdev1", 00:20:43.558 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:43.558 "strip_size_kb": 0, 00:20:43.558 "state": "online", 00:20:43.558 "raid_level": "raid1", 00:20:43.558 "superblock": true, 00:20:43.558 "num_base_bdevs": 2, 00:20:43.558 "num_base_bdevs_discovered": 1, 00:20:43.558 "num_base_bdevs_operational": 1, 00:20:43.558 "base_bdevs_list": [ 00:20:43.559 { 00:20:43.559 "name": null, 00:20:43.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.559 "is_configured": false, 00:20:43.559 "data_offset": 2048, 00:20:43.559 "data_size": 63488 00:20:43.559 }, 00:20:43.559 { 00:20:43.559 "name": "BaseBdev2", 00:20:43.559 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:43.559 "is_configured": true, 00:20:43.559 "data_offset": 2048, 00:20:43.559 "data_size": 63488 00:20:43.559 } 00:20:43.559 ] 00:20:43.559 }' 00:20:43.559 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:43.559 10:16:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:44.128 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:44.128 [2024-06-10 10:16:05.948333] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:44.128 [2024-06-10 10:16:05.948452] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:44.128 [2024-06-10 10:16:05.948467] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:44.128 [2024-06-10 10:16:05.948486] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:44.128 [2024-06-10 10:16:05.951812] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f57da0 00:20:44.128 [2024-06-10 10:16:05.953414] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:44.128 10:16:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:20:45.509 10:16:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:45.509 10:16:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:45.509 10:16:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:45.509 10:16:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:45.509 10:16:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:45.509 10:16:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.509 10:16:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:45.509 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:45.509 "name": "raid_bdev1", 00:20:45.509 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:45.509 "strip_size_kb": 0, 00:20:45.509 "state": "online", 00:20:45.509 "raid_level": "raid1", 00:20:45.509 "superblock": true, 00:20:45.509 "num_base_bdevs": 2, 00:20:45.509 "num_base_bdevs_discovered": 2, 00:20:45.509 "num_base_bdevs_operational": 2, 00:20:45.509 "process": { 00:20:45.509 "type": "rebuild", 00:20:45.509 "target": "spare", 00:20:45.509 "progress": { 00:20:45.509 "blocks": 22528, 00:20:45.509 "percent": 35 00:20:45.509 } 00:20:45.509 }, 00:20:45.509 "base_bdevs_list": [ 00:20:45.509 { 00:20:45.509 "name": "spare", 00:20:45.509 "uuid": "8bedad51-0f32-5d27-9158-ddd3f920c1c7", 00:20:45.509 "is_configured": true, 00:20:45.509 "data_offset": 2048, 00:20:45.509 "data_size": 63488 00:20:45.510 }, 00:20:45.510 { 00:20:45.510 "name": "BaseBdev2", 00:20:45.510 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:45.510 "is_configured": true, 00:20:45.510 "data_offset": 2048, 00:20:45.510 "data_size": 63488 00:20:45.510 } 00:20:45.510 ] 00:20:45.510 }' 00:20:45.510 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:45.510 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:45.510 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:45.510 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == 
\s\p\a\r\e ]] 00:20:45.510 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:45.770 [2024-06-10 10:16:07.433901] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:45.770 [2024-06-10 10:16:07.462228] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:45.770 [2024-06-10 10:16:07.462259] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:45.770 [2024-06-10 10:16:07.462268] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:45.770 [2024-06-10 10:16:07.462272] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.770 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:46.030 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:46.030 "name": "raid_bdev1", 00:20:46.030 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:46.030 "strip_size_kb": 0, 00:20:46.030 "state": "online", 00:20:46.030 "raid_level": "raid1", 00:20:46.030 "superblock": true, 00:20:46.030 "num_base_bdevs": 2, 00:20:46.030 "num_base_bdevs_discovered": 1, 00:20:46.030 "num_base_bdevs_operational": 1, 00:20:46.030 "base_bdevs_list": [ 00:20:46.030 { 00:20:46.030 "name": null, 00:20:46.030 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.030 "is_configured": false, 00:20:46.030 "data_offset": 2048, 00:20:46.030 "data_size": 63488 00:20:46.030 }, 00:20:46.030 { 00:20:46.030 "name": "BaseBdev2", 00:20:46.030 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:46.030 "is_configured": true, 00:20:46.030 "data_offset": 2048, 00:20:46.030 "data_size": 63488 00:20:46.030 } 00:20:46.030 ] 00:20:46.030 }' 00:20:46.030 10:16:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:46.030 10:16:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:46.599 10:16:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:46.599 [2024-06-10 10:16:08.384350] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:46.599 [2024-06-10 10:16:08.384386] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:46.600 [2024-06-10 10:16:08.384405] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f5a8b0 00:20:46.600 [2024-06-10 10:16:08.384412] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:46.600 [2024-06-10 10:16:08.384710] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:46.600 [2024-06-10 10:16:08.384722] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:46.600 [2024-06-10 10:16:08.384779] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:46.600 [2024-06-10 10:16:08.384786] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:46.600 [2024-06-10 10:16:08.384791] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:46.600 [2024-06-10 10:16:08.384802] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:46.600 [2024-06-10 10:16:08.388042] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f577a0 00:20:46.600 [2024-06-10 10:16:08.389166] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:46.600 spare 00:20:46.600 10:16:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:47.982 "name": "raid_bdev1", 00:20:47.982 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:47.982 "strip_size_kb": 0, 00:20:47.982 "state": "online", 00:20:47.982 "raid_level": "raid1", 00:20:47.982 "superblock": true, 00:20:47.982 "num_base_bdevs": 2, 00:20:47.982 "num_base_bdevs_discovered": 2, 00:20:47.982 "num_base_bdevs_operational": 2, 00:20:47.982 "process": { 00:20:47.982 "type": "rebuild", 00:20:47.982 "target": "spare", 00:20:47.982 "progress": { 00:20:47.982 "blocks": 22528, 00:20:47.982 "percent": 35 00:20:47.982 } 00:20:47.982 }, 00:20:47.982 "base_bdevs_list": [ 00:20:47.982 { 00:20:47.982 "name": "spare", 00:20:47.982 "uuid": "8bedad51-0f32-5d27-9158-ddd3f920c1c7", 00:20:47.982 "is_configured": true, 00:20:47.982 "data_offset": 
2048, 00:20:47.982 "data_size": 63488 00:20:47.982 }, 00:20:47.982 { 00:20:47.982 "name": "BaseBdev2", 00:20:47.982 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:47.982 "is_configured": true, 00:20:47.982 "data_offset": 2048, 00:20:47.982 "data_size": 63488 00:20:47.982 } 00:20:47.982 ] 00:20:47.982 }' 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:47.982 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:48.242 [2024-06-10 10:16:09.857933] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:48.243 [2024-06-10 10:16:09.897984] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:48.243 [2024-06-10 10:16:09.898015] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:48.243 [2024-06-10 10:16:09.898025] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:48.243 [2024-06-10 10:16:09.898029] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.243 10:16:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:48.504 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.504 "name": "raid_bdev1", 00:20:48.504 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:48.504 "strip_size_kb": 0, 00:20:48.504 "state": "online", 00:20:48.504 "raid_level": "raid1", 00:20:48.504 "superblock": true, 00:20:48.504 "num_base_bdevs": 2, 00:20:48.504 "num_base_bdevs_discovered": 1, 00:20:48.504 "num_base_bdevs_operational": 1, 00:20:48.504 "base_bdevs_list": [ 00:20:48.504 { 00:20:48.504 "name": null, 
00:20:48.504 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.504 "is_configured": false, 00:20:48.504 "data_offset": 2048, 00:20:48.504 "data_size": 63488 00:20:48.504 }, 00:20:48.504 { 00:20:48.504 "name": "BaseBdev2", 00:20:48.504 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:48.504 "is_configured": true, 00:20:48.504 "data_offset": 2048, 00:20:48.504 "data_size": 63488 00:20:48.504 } 00:20:48.504 ] 00:20:48.504 }' 00:20:48.504 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.504 10:16:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:49.073 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:49.073 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:49.073 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:49.073 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:49.073 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:49.073 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.073 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:49.073 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:49.073 "name": "raid_bdev1", 00:20:49.073 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:49.073 "strip_size_kb": 0, 00:20:49.073 "state": "online", 00:20:49.073 "raid_level": "raid1", 00:20:49.073 "superblock": true, 00:20:49.073 "num_base_bdevs": 2, 00:20:49.073 "num_base_bdevs_discovered": 1, 00:20:49.073 "num_base_bdevs_operational": 1, 00:20:49.073 "base_bdevs_list": [ 00:20:49.073 { 00:20:49.073 "name": null, 00:20:49.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.073 "is_configured": false, 00:20:49.073 "data_offset": 2048, 00:20:49.073 "data_size": 63488 00:20:49.073 }, 00:20:49.073 { 00:20:49.073 "name": "BaseBdev2", 00:20:49.073 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:49.073 "is_configured": true, 00:20:49.073 "data_offset": 2048, 00:20:49.073 "data_size": 63488 00:20:49.073 } 00:20:49.073 ] 00:20:49.073 }' 00:20:49.073 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:49.073 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:49.073 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:49.333 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:49.333 10:16:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:49.333 10:16:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:49.592 [2024-06-10 10:16:11.329636] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:49.592 [2024-06-10 10:16:11.329666] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:49.592 [2024-06-10 10:16:11.329680] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f5c9d0 00:20:49.592 [2024-06-10 10:16:11.329686] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:49.592 [2024-06-10 10:16:11.329959] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:49.592 [2024-06-10 10:16:11.329971] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:49.592 [2024-06-10 10:16:11.330015] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:49.592 [2024-06-10 10:16:11.330022] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:20:49.592 [2024-06-10 10:16:11.330027] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:49.592 BaseBdev1 00:20:49.592 10:16:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.532 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.793 10:16:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.793 "name": "raid_bdev1", 00:20:50.793 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:50.793 "strip_size_kb": 0, 00:20:50.793 "state": "online", 00:20:50.793 "raid_level": "raid1", 00:20:50.793 "superblock": true, 00:20:50.793 "num_base_bdevs": 2, 00:20:50.793 "num_base_bdevs_discovered": 1, 00:20:50.793 "num_base_bdevs_operational": 1, 00:20:50.793 "base_bdevs_list": [ 00:20:50.793 { 00:20:50.793 "name": null, 00:20:50.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.793 "is_configured": false, 00:20:50.793 "data_offset": 2048, 00:20:50.793 "data_size": 63488 00:20:50.793 }, 00:20:50.793 { 00:20:50.793 "name": "BaseBdev2", 00:20:50.793 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:50.793 "is_configured": true, 00:20:50.793 "data_offset": 2048, 00:20:50.793 "data_size": 63488 00:20:50.793 } 00:20:50.793 ] 00:20:50.793 }' 00:20:50.793 10:16:12 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.793 10:16:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:51.364 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:51.364 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:51.364 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:51.364 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:51.364 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:51.364 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.364 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:51.626 "name": "raid_bdev1", 00:20:51.626 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:51.626 "strip_size_kb": 0, 00:20:51.626 "state": "online", 00:20:51.626 "raid_level": "raid1", 00:20:51.626 "superblock": true, 00:20:51.626 "num_base_bdevs": 2, 00:20:51.626 "num_base_bdevs_discovered": 1, 00:20:51.626 "num_base_bdevs_operational": 1, 00:20:51.626 "base_bdevs_list": [ 00:20:51.626 { 00:20:51.626 "name": null, 00:20:51.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:51.626 "is_configured": false, 00:20:51.626 "data_offset": 2048, 00:20:51.626 "data_size": 63488 00:20:51.626 }, 00:20:51.626 { 00:20:51.626 "name": "BaseBdev2", 00:20:51.626 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:51.626 "is_configured": true, 00:20:51.626 "data_offset": 2048, 00:20:51.626 "data_size": 63488 00:20:51.626 } 00:20:51.626 ] 00:20:51.626 }' 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@649 -- # local es=0 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:51.626 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:51.887 [2024-06-10 10:16:13.547251] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:51.887 [2024-06-10 10:16:13.547343] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:20:51.887 [2024-06-10 10:16:13.547351] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:51.887 request: 00:20:51.887 { 00:20:51.887 "raid_bdev": "raid_bdev1", 00:20:51.887 "base_bdev": "BaseBdev1", 00:20:51.887 "method": "bdev_raid_add_base_bdev", 00:20:51.887 "req_id": 1 00:20:51.887 } 00:20:51.887 Got JSON-RPC error response 00:20:51.887 response: 00:20:51.887 { 00:20:51.887 "code": -22, 00:20:51.887 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:51.887 } 00:20:51.887 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # es=1 00:20:51.887 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:20:51.887 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:20:51.887 10:16:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:20:51.887 10:16:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
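The sequence above is the negative path for bdev_raid_add_base_bdev: BaseBdev1's examined superblock sequence number (1) is older than the raid bdev's (5), so the RPC is expected to fail with JSON-RPC error -22, after which the array must still be online with only one operational base bdev. A minimal sketch of that check, reconstructed from the commands visible in the trace (NOT and verify_raid_bdev_state are the test suite's own helpers; this is an illustration, not the script source):

  # Expect the RPC to be rejected because BaseBdev1 carries a stale superblock.
  NOT ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1
  # The raid bdev must stay online, raid1, with 1 of 2 base bdevs operational.
  verify_raid_bdev_state raid_bdev1 online raid1 0 1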
00:20:52.829 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:53.089 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.089 "name": "raid_bdev1", 00:20:53.089 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:53.089 "strip_size_kb": 0, 00:20:53.089 "state": "online", 00:20:53.089 "raid_level": "raid1", 00:20:53.089 "superblock": true, 00:20:53.089 "num_base_bdevs": 2, 00:20:53.089 "num_base_bdevs_discovered": 1, 00:20:53.089 "num_base_bdevs_operational": 1, 00:20:53.089 "base_bdevs_list": [ 00:20:53.089 { 00:20:53.089 "name": null, 00:20:53.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.089 "is_configured": false, 00:20:53.089 "data_offset": 2048, 00:20:53.089 "data_size": 63488 00:20:53.089 }, 00:20:53.089 { 00:20:53.089 "name": "BaseBdev2", 00:20:53.089 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:53.089 "is_configured": true, 00:20:53.089 "data_offset": 2048, 00:20:53.089 "data_size": 63488 00:20:53.089 } 00:20:53.089 ] 00:20:53.089 }' 00:20:53.089 10:16:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.089 10:16:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:53.660 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:53.660 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:53.660 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:53.660 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:53.660 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:53.660 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.660 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:53.660 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:53.660 "name": "raid_bdev1", 00:20:53.660 "uuid": "fdad1d7b-10a1-4f5f-81d1-b6ee4ec2042c", 00:20:53.660 "strip_size_kb": 0, 00:20:53.660 "state": "online", 00:20:53.660 "raid_level": "raid1", 00:20:53.660 "superblock": true, 00:20:53.660 "num_base_bdevs": 2, 00:20:53.660 "num_base_bdevs_discovered": 1, 00:20:53.660 "num_base_bdevs_operational": 1, 00:20:53.660 "base_bdevs_list": [ 00:20:53.660 { 00:20:53.660 "name": null, 00:20:53.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.660 "is_configured": false, 00:20:53.660 "data_offset": 2048, 00:20:53.660 "data_size": 63488 00:20:53.660 }, 00:20:53.660 { 00:20:53.660 "name": "BaseBdev2", 00:20:53.660 "uuid": "405d4ee9-4101-5d3c-b7f5-33c2ebc72589", 00:20:53.660 "is_configured": true, 00:20:53.660 "data_offset": 2048, 00:20:53.660 "data_size": 63488 00:20:53.660 } 00:20:53.660 ] 00:20:53.660 }' 00:20:53.660 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1072768 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1072768 ']' 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # kill -0 1072768 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # uname 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1072768 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1072768' 00:20:53.922 killing process with pid 1072768 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # kill 1072768 00:20:53.922 Received shutdown signal, test time was about 60.000000 seconds 00:20:53.922 00:20:53.922 Latency(us) 00:20:53.922 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:53.922 =================================================================================================================== 00:20:53.922 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:53.922 [2024-06-10 10:16:15.652886] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:53.922 [2024-06-10 10:16:15.652956] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:53.922 [2024-06-10 10:16:15.652992] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:53.922 [2024-06-10 10:16:15.652999] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1db0cf0 name raid_bdev1, state offline 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@973 -- # wait 1072768 00:20:53.922 [2024-06-10 10:16:15.668052] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:20:53.922 00:20:53.922 real 0m30.621s 00:20:53.922 user 0m45.555s 00:20:53.922 sys 0m4.146s 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:53.922 10:16:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:53.922 ************************************ 00:20:53.922 END TEST raid_rebuild_test_sb 00:20:53.922 ************************************ 00:20:54.183 10:16:15 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:20:54.183 10:16:15 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:20:54.183 10:16:15 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:54.183 10:16:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:54.183 ************************************ 00:20:54.183 START TEST raid_rebuild_test_io 00:20:54.183 ************************************ 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 false true true 
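The next test, raid_rebuild_test_io, is dispatched as raid_rebuild_test raid1 2 false true true; judging from the local variables traced below, the positional arguments map to raid_level, num_base_bdevs, superblock, background_io and verify. A hedged sketch of the call shape (argument names inferred from those locals, not quoted from the script source):

  # raid_rebuild_test <raid_level> <num_base_bdevs> <superblock> <background_io> <verify>
  raid_rebuild_test raid1 2 false true true   # raid1, two base bdevs, no superblock, background I/O on, verify on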
00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1078405 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1078405 /var/tmp/spdk-raid.sock 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@830 -- # '[' -z 1078405 ']' 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:54.183 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
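The rebuild-under-I/O test starts a fresh bdevperf instance as its RPC target, configured for 60 seconds of 50/50 random read/write with 3 MiB I/Os at queue depth 2 against raid_bdev1, and then waits for the RPC socket. A sketch of that launch-and-wait pattern, with paths shortened and the backgrounding/pid capture assumed rather than copied from the script:

  # Start bdevperf as the RPC server for the rebuild-under-I/O test.
  ./build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
      -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!                                   # assumed: pid of the backgrounded bdevperf
  waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock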
00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:20:54.183 10:16:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:54.183 [2024-06-10 10:16:15.923623] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:20:54.183 [2024-06-10 10:16:15.923667] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1078405 ] 00:20:54.183 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:54.183 Zero copy mechanism will not be used. 00:20:54.183 [2024-06-10 10:16:15.991687] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:54.444 [2024-06-10 10:16:16.054687] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:20:54.444 [2024-06-10 10:16:16.095982] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:54.444 [2024-06-10 10:16:16.096005] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:55.015 10:16:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:20:55.015 10:16:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@863 -- # return 0 00:20:55.015 10:16:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:55.015 10:16:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:55.275 BaseBdev1_malloc 00:20:55.275 10:16:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:55.275 [2024-06-10 10:16:17.117838] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:55.275 [2024-06-10 10:16:17.117875] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:55.275 [2024-06-10 10:16:17.117890] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2465a50 00:20:55.276 [2024-06-10 10:16:17.117896] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:55.276 [2024-06-10 10:16:17.119277] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:55.276 [2024-06-10 10:16:17.119297] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:55.276 BaseBdev1 00:20:55.276 10:16:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:55.276 10:16:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:55.536 BaseBdev2_malloc 00:20:55.536 10:16:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:55.796 [2024-06-10 10:16:17.492531] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:55.796 [2024-06-10 10:16:17.492559] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
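Each base bdev here is a 32 MiB, 512-byte-block malloc bdev wrapped in a passthru bdev, so the suite can later delete and re-create the wrapper (as the _sb test above does with bdev_passthru_delete/bdev_passthru_create); BaseBdev1 is built that way above, and BaseBdev2's registration notices continue below. A minimal sketch of the construction, using only the two RPCs visible in the trace (paths shortened):

  # One base bdev = 32 MiB malloc bdev + a passthru wrapper on top of it.
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1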
00:20:55.796 [2024-06-10 10:16:17.492574] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24665a0 00:20:55.796 [2024-06-10 10:16:17.492580] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:55.796 [2024-06-10 10:16:17.493756] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:55.796 [2024-06-10 10:16:17.493774] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:55.796 BaseBdev2 00:20:55.796 10:16:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:56.057 spare_malloc 00:20:56.057 10:16:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:56.057 spare_delay 00:20:56.057 10:16:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:56.317 [2024-06-10 10:16:18.035481] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:56.317 [2024-06-10 10:16:18.035511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:56.317 [2024-06-10 10:16:18.035522] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2614450 00:20:56.317 [2024-06-10 10:16:18.035528] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:56.317 [2024-06-10 10:16:18.036690] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:56.317 [2024-06-10 10:16:18.036709] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:56.317 spare 00:20:56.317 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:56.579 [2024-06-10 10:16:18.215946] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:56.579 [2024-06-10 10:16:18.216906] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:56.579 [2024-06-10 10:16:18.216960] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2613960 00:20:56.579 [2024-06-10 10:16:18.216966] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:56.579 [2024-06-10 10:16:18.217111] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2610340 00:20:56.579 [2024-06-10 10:16:18.217216] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2613960 00:20:56.579 [2024-06-10 10:16:18.217221] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2613960 00:20:56.579 [2024-06-10 10:16:18.217297] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:56.579 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:56.579 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:56.579 10:16:18 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:56.579 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:56.579 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:56.579 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:56.579 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.580 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.580 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.580 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.580 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.580 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:56.580 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:56.580 "name": "raid_bdev1", 00:20:56.580 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:20:56.580 "strip_size_kb": 0, 00:20:56.580 "state": "online", 00:20:56.580 "raid_level": "raid1", 00:20:56.580 "superblock": false, 00:20:56.580 "num_base_bdevs": 2, 00:20:56.580 "num_base_bdevs_discovered": 2, 00:20:56.580 "num_base_bdevs_operational": 2, 00:20:56.580 "base_bdevs_list": [ 00:20:56.580 { 00:20:56.580 "name": "BaseBdev1", 00:20:56.580 "uuid": "54802aa2-a5fe-56ae-bb62-0c36170e836c", 00:20:56.580 "is_configured": true, 00:20:56.580 "data_offset": 0, 00:20:56.580 "data_size": 65536 00:20:56.580 }, 00:20:56.580 { 00:20:56.580 "name": "BaseBdev2", 00:20:56.580 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:20:56.580 "is_configured": true, 00:20:56.580 "data_offset": 0, 00:20:56.580 "data_size": 65536 00:20:56.580 } 00:20:56.580 ] 00:20:56.580 }' 00:20:56.580 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:56.580 10:16:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:57.207 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:57.207 10:16:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:57.467 [2024-06-10 10:16:19.142460] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:57.467 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:20:57.467 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.467 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 
00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:57.727 [2024-06-10 10:16:19.448422] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2616170 00:20:57.727 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:57.727 Zero copy mechanism will not be used. 00:20:57.727 Running I/O for 60 seconds... 00:20:57.727 [2024-06-10 10:16:19.531163] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:57.727 [2024-06-10 10:16:19.531315] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2616170 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.727 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:57.987 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.987 "name": "raid_bdev1", 00:20:57.987 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:20:57.987 "strip_size_kb": 0, 00:20:57.987 "state": "online", 00:20:57.987 "raid_level": "raid1", 00:20:57.987 "superblock": false, 00:20:57.987 "num_base_bdevs": 2, 00:20:57.987 "num_base_bdevs_discovered": 1, 00:20:57.987 "num_base_bdevs_operational": 1, 00:20:57.987 "base_bdevs_list": [ 00:20:57.987 { 00:20:57.987 "name": null, 00:20:57.987 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.987 "is_configured": false, 00:20:57.987 "data_offset": 0, 00:20:57.987 "data_size": 65536 00:20:57.987 }, 00:20:57.987 { 00:20:57.987 "name": "BaseBdev2", 00:20:57.987 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:20:57.987 "is_configured": true, 00:20:57.987 "data_offset": 0, 00:20:57.987 "data_size": 65536 00:20:57.987 } 00:20:57.987 ] 00:20:57.987 }' 00:20:57.987 10:16:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.987 10:16:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:58.556 10:16:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:58.816 
[2024-06-10 10:16:20.508562] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:58.816 10:16:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:58.816 [2024-06-10 10:16:20.547898] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2616680 00:20:58.816 [2024-06-10 10:16:20.549494] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:58.816 [2024-06-10 10:16:20.670887] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:58.816 [2024-06-10 10:16:20.671099] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:59.075 [2024-06-10 10:16:20.886491] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:59.075 [2024-06-10 10:16:20.886592] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:59.646 [2024-06-10 10:16:21.209851] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:59.646 [2024-06-10 10:16:21.210080] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:59.646 [2024-06-10 10:16:21.417771] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:59.646 [2024-06-10 10:16:21.417899] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:59.906 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:59.906 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:59.906 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:59.906 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:59.906 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:59.906 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.906 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:59.906 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:59.906 "name": "raid_bdev1", 00:20:59.906 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:20:59.906 "strip_size_kb": 0, 00:20:59.906 "state": "online", 00:20:59.906 "raid_level": "raid1", 00:20:59.906 "superblock": false, 00:20:59.906 "num_base_bdevs": 2, 00:20:59.906 "num_base_bdevs_discovered": 2, 00:20:59.906 "num_base_bdevs_operational": 2, 00:20:59.906 "process": { 00:20:59.906 "type": "rebuild", 00:20:59.906 "target": "spare", 00:20:59.906 "progress": { 00:20:59.906 "blocks": 12288, 00:20:59.906 "percent": 18 00:20:59.906 } 00:20:59.906 }, 00:20:59.906 "base_bdevs_list": [ 00:20:59.906 { 00:20:59.906 "name": "spare", 00:20:59.906 "uuid": "79314623-ea5c-52d1-88b3-6f9fedab7c2a", 00:20:59.906 "is_configured": true, 00:20:59.906 "data_offset": 0, 00:20:59.906 "data_size": 65536 00:20:59.906 }, 
00:20:59.906 { 00:20:59.906 "name": "BaseBdev2", 00:20:59.906 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:20:59.906 "is_configured": true, 00:20:59.906 "data_offset": 0, 00:20:59.906 "data_size": 65536 00:20:59.906 } 00:20:59.906 ] 00:20:59.906 }' 00:20:59.906 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:59.906 [2024-06-10 10:16:21.762165] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:00.166 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:00.166 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:00.166 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:00.166 10:16:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:00.166 [2024-06-10 10:16:21.884745] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:00.166 [2024-06-10 10:16:21.884887] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:00.166 [2024-06-10 10:16:22.011870] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:00.424 [2024-06-10 10:16:22.114775] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:00.424 [2024-06-10 10:16:22.128937] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:00.424 [2024-06-10 10:16:22.128954] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:00.424 [2024-06-10 10:16:22.128960] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:00.424 [2024-06-10 10:16:22.152244] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2616170 00:21:00.424 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:00.424 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:00.424 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:00.424 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:00.424 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:00.424 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:00.424 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.424 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.424 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.424 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.424 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.425 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq 
-r '.[] | select(.name == "raid_bdev1")' 00:21:00.683 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.683 "name": "raid_bdev1", 00:21:00.683 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:21:00.683 "strip_size_kb": 0, 00:21:00.683 "state": "online", 00:21:00.683 "raid_level": "raid1", 00:21:00.683 "superblock": false, 00:21:00.683 "num_base_bdevs": 2, 00:21:00.683 "num_base_bdevs_discovered": 1, 00:21:00.683 "num_base_bdevs_operational": 1, 00:21:00.683 "base_bdevs_list": [ 00:21:00.683 { 00:21:00.683 "name": null, 00:21:00.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.683 "is_configured": false, 00:21:00.683 "data_offset": 0, 00:21:00.683 "data_size": 65536 00:21:00.683 }, 00:21:00.683 { 00:21:00.683 "name": "BaseBdev2", 00:21:00.683 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:21:00.683 "is_configured": true, 00:21:00.683 "data_offset": 0, 00:21:00.683 "data_size": 65536 00:21:00.683 } 00:21:00.683 ] 00:21:00.683 }' 00:21:00.683 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.683 10:16:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:01.253 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:01.253 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:01.253 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:01.253 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:01.253 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:01.253 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.253 10:16:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:01.513 10:16:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:01.513 "name": "raid_bdev1", 00:21:01.513 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:21:01.513 "strip_size_kb": 0, 00:21:01.513 "state": "online", 00:21:01.513 "raid_level": "raid1", 00:21:01.513 "superblock": false, 00:21:01.513 "num_base_bdevs": 2, 00:21:01.513 "num_base_bdevs_discovered": 1, 00:21:01.513 "num_base_bdevs_operational": 1, 00:21:01.513 "base_bdevs_list": [ 00:21:01.513 { 00:21:01.513 "name": null, 00:21:01.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.513 "is_configured": false, 00:21:01.513 "data_offset": 0, 00:21:01.513 "data_size": 65536 00:21:01.513 }, 00:21:01.513 { 00:21:01.513 "name": "BaseBdev2", 00:21:01.513 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:21:01.513 "is_configured": true, 00:21:01.513 "data_offset": 0, 00:21:01.513 "data_size": 65536 00:21:01.513 } 00:21:01.513 ] 00:21:01.513 }' 00:21:01.513 10:16:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:01.513 10:16:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:01.513 10:16:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:01.513 10:16:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:01.513 10:16:23 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:01.513 [2024-06-10 10:16:23.365188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:01.773 10:16:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:01.773 [2024-06-10 10:16:23.398227] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2610340 00:21:01.773 [2024-06-10 10:16:23.399359] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:01.773 [2024-06-10 10:16:23.507639] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:01.773 [2024-06-10 10:16:23.507857] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:02.033 [2024-06-10 10:16:23.709451] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:02.033 [2024-06-10 10:16:23.709556] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:02.293 [2024-06-10 10:16:24.027294] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:02.293 [2024-06-10 10:16:24.149535] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:02.553 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:02.553 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:02.553 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:02.553 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:02.553 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:02.553 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.553 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.813 [2024-06-10 10:16:24.483220] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:02.813 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:02.813 "name": "raid_bdev1", 00:21:02.813 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:21:02.813 "strip_size_kb": 0, 00:21:02.813 "state": "online", 00:21:02.813 "raid_level": "raid1", 00:21:02.813 "superblock": false, 00:21:02.813 "num_base_bdevs": 2, 00:21:02.813 "num_base_bdevs_discovered": 2, 00:21:02.813 "num_base_bdevs_operational": 2, 00:21:02.813 "process": { 00:21:02.813 "type": "rebuild", 00:21:02.813 "target": "spare", 00:21:02.813 "progress": { 00:21:02.813 "blocks": 14336, 00:21:02.813 "percent": 21 00:21:02.813 } 00:21:02.813 }, 00:21:02.813 "base_bdevs_list": [ 00:21:02.813 { 00:21:02.813 "name": "spare", 00:21:02.813 "uuid": "79314623-ea5c-52d1-88b3-6f9fedab7c2a", 00:21:02.813 "is_configured": true, 00:21:02.813 "data_offset": 0, 00:21:02.813 "data_size": 65536 00:21:02.813 
}, 00:21:02.813 { 00:21:02.813 "name": "BaseBdev2", 00:21:02.813 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:21:02.813 "is_configured": true, 00:21:02.813 "data_offset": 0, 00:21:02.813 "data_size": 65536 00:21:02.813 } 00:21:02.813 ] 00:21:02.813 }' 00:21:02.813 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:02.813 [2024-06-10 10:16:24.612088] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:02.813 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:02.813 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=676 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:03.073 "name": "raid_bdev1", 00:21:03.073 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:21:03.073 "strip_size_kb": 0, 00:21:03.073 "state": "online", 00:21:03.073 "raid_level": "raid1", 00:21:03.073 "superblock": false, 00:21:03.073 "num_base_bdevs": 2, 00:21:03.073 "num_base_bdevs_discovered": 2, 00:21:03.073 "num_base_bdevs_operational": 2, 00:21:03.073 "process": { 00:21:03.073 "type": "rebuild", 00:21:03.073 "target": "spare", 00:21:03.073 "progress": { 00:21:03.073 "blocks": 20480, 00:21:03.073 "percent": 31 00:21:03.073 } 00:21:03.073 }, 00:21:03.073 "base_bdevs_list": [ 00:21:03.073 { 00:21:03.073 "name": "spare", 00:21:03.073 "uuid": "79314623-ea5c-52d1-88b3-6f9fedab7c2a", 00:21:03.073 "is_configured": true, 00:21:03.073 "data_offset": 0, 00:21:03.073 "data_size": 65536 00:21:03.073 }, 00:21:03.073 { 00:21:03.073 "name": "BaseBdev2", 00:21:03.073 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:21:03.073 "is_configured": true, 00:21:03.073 "data_offset": 0, 00:21:03.073 "data_size": 65536 00:21:03.073 } 00:21:03.073 
] 00:21:03.073 }' 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:03.073 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:03.332 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:03.332 10:16:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:03.332 [2024-06-10 10:16:25.164949] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:21:03.592 [2024-06-10 10:16:25.387176] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:21:04.163 [2024-06-10 10:16:25.730163] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:21:04.163 [2024-06-10 10:16:25.944450] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:21:04.163 [2024-06-10 10:16:25.944585] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:21:04.163 10:16:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:04.163 10:16:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:04.163 10:16:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:04.163 10:16:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:04.163 10:16:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:04.163 10:16:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:04.163 10:16:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.163 10:16:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:04.423 10:16:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:04.423 "name": "raid_bdev1", 00:21:04.423 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:21:04.423 "strip_size_kb": 0, 00:21:04.423 "state": "online", 00:21:04.423 "raid_level": "raid1", 00:21:04.423 "superblock": false, 00:21:04.423 "num_base_bdevs": 2, 00:21:04.423 "num_base_bdevs_discovered": 2, 00:21:04.423 "num_base_bdevs_operational": 2, 00:21:04.423 "process": { 00:21:04.423 "type": "rebuild", 00:21:04.423 "target": "spare", 00:21:04.423 "progress": { 00:21:04.423 "blocks": 34816, 00:21:04.423 "percent": 53 00:21:04.423 } 00:21:04.423 }, 00:21:04.423 "base_bdevs_list": [ 00:21:04.423 { 00:21:04.423 "name": "spare", 00:21:04.423 "uuid": "79314623-ea5c-52d1-88b3-6f9fedab7c2a", 00:21:04.423 "is_configured": true, 00:21:04.423 "data_offset": 0, 00:21:04.423 "data_size": 65536 00:21:04.423 }, 00:21:04.423 { 00:21:04.423 "name": "BaseBdev2", 00:21:04.423 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:21:04.423 "is_configured": true, 00:21:04.423 "data_offset": 0, 00:21:04.423 "data_size": 65536 00:21:04.423 } 00:21:04.423 ] 00:21:04.423 }' 
00:21:04.423 10:16:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:04.423 10:16:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:04.423 10:16:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:04.423 10:16:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:04.423 10:16:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:04.682 [2024-06-10 10:16:26.390261] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:21:04.682 [2024-06-10 10:16:26.390393] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:21:04.942 [2024-06-10 10:16:26.727845] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:21:05.514 [2024-06-10 10:16:27.165761] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:21:05.514 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:05.514 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:05.514 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:05.514 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:05.514 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:05.514 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:05.514 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.514 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:05.774 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:05.774 "name": "raid_bdev1", 00:21:05.774 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:21:05.774 "strip_size_kb": 0, 00:21:05.774 "state": "online", 00:21:05.774 "raid_level": "raid1", 00:21:05.774 "superblock": false, 00:21:05.774 "num_base_bdevs": 2, 00:21:05.774 "num_base_bdevs_discovered": 2, 00:21:05.774 "num_base_bdevs_operational": 2, 00:21:05.774 "process": { 00:21:05.774 "type": "rebuild", 00:21:05.774 "target": "spare", 00:21:05.774 "progress": { 00:21:05.774 "blocks": 57344, 00:21:05.774 "percent": 87 00:21:05.774 } 00:21:05.774 }, 00:21:05.774 "base_bdevs_list": [ 00:21:05.774 { 00:21:05.774 "name": "spare", 00:21:05.774 "uuid": "79314623-ea5c-52d1-88b3-6f9fedab7c2a", 00:21:05.774 "is_configured": true, 00:21:05.774 "data_offset": 0, 00:21:05.774 "data_size": 65536 00:21:05.774 }, 00:21:05.774 { 00:21:05.774 "name": "BaseBdev2", 00:21:05.774 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:21:05.774 "is_configured": true, 00:21:05.774 "data_offset": 0, 00:21:05.774 "data_size": 65536 00:21:05.774 } 00:21:05.774 ] 00:21:05.774 }' 00:21:05.774 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:05.774 10:16:27 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:05.774 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:05.774 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:05.774 10:16:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:06.064 [2024-06-10 10:16:27.819116] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:06.064 [2024-06-10 10:16:27.925664] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:06.064 [2024-06-10 10:16:27.926803] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:07.005 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:07.005 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:07.005 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:07.005 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:07.005 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:07.005 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:07.005 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.005 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:07.005 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:07.005 "name": "raid_bdev1", 00:21:07.005 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:21:07.005 "strip_size_kb": 0, 00:21:07.005 "state": "online", 00:21:07.005 "raid_level": "raid1", 00:21:07.005 "superblock": false, 00:21:07.005 "num_base_bdevs": 2, 00:21:07.005 "num_base_bdevs_discovered": 2, 00:21:07.005 "num_base_bdevs_operational": 2, 00:21:07.005 "base_bdevs_list": [ 00:21:07.005 { 00:21:07.005 "name": "spare", 00:21:07.005 "uuid": "79314623-ea5c-52d1-88b3-6f9fedab7c2a", 00:21:07.006 "is_configured": true, 00:21:07.006 "data_offset": 0, 00:21:07.006 "data_size": 65536 00:21:07.006 }, 00:21:07.006 { 00:21:07.006 "name": "BaseBdev2", 00:21:07.006 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:21:07.006 "is_configured": true, 00:21:07.006 "data_offset": 0, 00:21:07.006 "data_size": 65536 00:21:07.006 } 00:21:07.006 ] 00:21:07.006 }' 00:21:07.006 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:07.006 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:07.006 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:07.006 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:07.006 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:21:07.006 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:07.006 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:07.006 10:16:28 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:07.006 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:07.006 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:07.006 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.006 10:16:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:07.266 "name": "raid_bdev1", 00:21:07.266 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:21:07.266 "strip_size_kb": 0, 00:21:07.266 "state": "online", 00:21:07.266 "raid_level": "raid1", 00:21:07.266 "superblock": false, 00:21:07.266 "num_base_bdevs": 2, 00:21:07.266 "num_base_bdevs_discovered": 2, 00:21:07.266 "num_base_bdevs_operational": 2, 00:21:07.266 "base_bdevs_list": [ 00:21:07.266 { 00:21:07.266 "name": "spare", 00:21:07.266 "uuid": "79314623-ea5c-52d1-88b3-6f9fedab7c2a", 00:21:07.266 "is_configured": true, 00:21:07.266 "data_offset": 0, 00:21:07.266 "data_size": 65536 00:21:07.266 }, 00:21:07.266 { 00:21:07.266 "name": "BaseBdev2", 00:21:07.266 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:21:07.266 "is_configured": true, 00:21:07.266 "data_offset": 0, 00:21:07.266 "data_size": 65536 00:21:07.266 } 00:21:07.266 ] 00:21:07.266 }' 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.266 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:07.527 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:21:07.527 "name": "raid_bdev1", 00:21:07.527 "uuid": "08a9ad03-2875-416d-a8cd-6ffd71f7b367", 00:21:07.527 "strip_size_kb": 0, 00:21:07.527 "state": "online", 00:21:07.527 "raid_level": "raid1", 00:21:07.527 "superblock": false, 00:21:07.527 "num_base_bdevs": 2, 00:21:07.527 "num_base_bdevs_discovered": 2, 00:21:07.527 "num_base_bdevs_operational": 2, 00:21:07.527 "base_bdevs_list": [ 00:21:07.527 { 00:21:07.527 "name": "spare", 00:21:07.527 "uuid": "79314623-ea5c-52d1-88b3-6f9fedab7c2a", 00:21:07.527 "is_configured": true, 00:21:07.527 "data_offset": 0, 00:21:07.527 "data_size": 65536 00:21:07.527 }, 00:21:07.527 { 00:21:07.527 "name": "BaseBdev2", 00:21:07.527 "uuid": "1340fa89-eb8e-58ac-9f7c-a66b483b7dcb", 00:21:07.527 "is_configured": true, 00:21:07.527 "data_offset": 0, 00:21:07.527 "data_size": 65536 00:21:07.527 } 00:21:07.527 ] 00:21:07.527 }' 00:21:07.527 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:07.527 10:16:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:08.097 10:16:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:08.358 [2024-06-10 10:16:30.045607] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:08.358 [2024-06-10 10:16:30.045631] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:08.358 00:21:08.358 Latency(us) 00:21:08.358 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:08.358 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:21:08.358 raid_bdev1 : 10.61 107.57 322.72 0.00 0.00 12036.12 242.61 115343.36 00:21:08.358 =================================================================================================================== 00:21:08.358 Total : 107.57 322.72 0.00 0.00 12036.12 242.61 115343.36 00:21:08.358 [2024-06-10 10:16:30.085011] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:08.358 [2024-06-10 10:16:30.085035] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:08.358 [2024-06-10 10:16:30.085091] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:08.358 [2024-06-10 10:16:30.085098] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2613960 name raid_bdev1, state offline 00:21:08.358 0 00:21:08.358 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.358 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:08.630 10:16:30 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:08.630 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:08.630 /dev/nbd0 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:08.895 1+0 records in 00:21:08.895 1+0 records out 00:21:08.895 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281649 s, 14.5 MB/s 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:21:08.895 /dev/nbd1 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:08.895 1+0 records in 00:21:08.895 1+0 records out 00:21:08.895 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026379 s, 15.5 MB/s 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:08.895 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:09.156 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:09.156 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:09.156 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 
00:21:09.156 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:09.156 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:09.156 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:09.156 10:16:30 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:09.156 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1078405 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@949 -- # '[' -z 1078405 ']' 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # kill -0 1078405 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # uname 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 
-- # '[' Linux = Linux ']' 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1078405 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1078405' 00:21:09.416 killing process with pid 1078405 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # kill 1078405 00:21:09.416 Received shutdown signal, test time was about 11.763056 seconds 00:21:09.416 00:21:09.416 Latency(us) 00:21:09.416 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:09.416 =================================================================================================================== 00:21:09.416 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:09.416 [2024-06-10 10:16:31.240752] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:09.416 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@973 -- # wait 1078405 00:21:09.416 [2024-06-10 10:16:31.252166] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:21:09.678 00:21:09.678 real 0m15.513s 00:21:09.678 user 0m23.747s 00:21:09.678 sys 0m1.847s 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:09.678 ************************************ 00:21:09.678 END TEST raid_rebuild_test_io 00:21:09.678 ************************************ 00:21:09.678 10:16:31 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:21:09.678 10:16:31 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:21:09.678 10:16:31 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:09.678 10:16:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:09.678 ************************************ 00:21:09.678 START TEST raid_rebuild_test_sb_io 00:21:09.678 ************************************ 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true true true 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:09.678 10:16:31 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1081303 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1081303 /var/tmp/spdk-raid.sock 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@830 -- # '[' -z 1081303 ']' 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:09.678 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:09.678 10:16:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:09.678 [2024-06-10 10:16:31.511938] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:21:09.678 [2024-06-10 10:16:31.511984] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1081303 ] 00:21:09.678 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:09.678 Zero copy mechanism will not be used. 
00:21:09.939 [2024-06-10 10:16:31.600192] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:09.939 [2024-06-10 10:16:31.670577] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:09.939 [2024-06-10 10:16:31.716515] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:09.939 [2024-06-10 10:16:31.716541] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:10.511 10:16:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:10.511 10:16:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@863 -- # return 0 00:21:10.511 10:16:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:10.511 10:16:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:10.772 BaseBdev1_malloc 00:21:10.772 10:16:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:11.032 [2024-06-10 10:16:32.695360] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:11.032 [2024-06-10 10:16:32.695395] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:11.032 [2024-06-10 10:16:32.695407] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1733a50 00:21:11.032 [2024-06-10 10:16:32.695414] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:11.032 [2024-06-10 10:16:32.696687] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:11.032 [2024-06-10 10:16:32.696706] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:11.032 BaseBdev1 00:21:11.032 10:16:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:11.032 10:16:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:11.032 BaseBdev2_malloc 00:21:11.032 10:16:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:11.293 [2024-06-10 10:16:33.058037] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:11.293 [2024-06-10 10:16:33.058065] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:11.293 [2024-06-10 10:16:33.058077] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17345a0 00:21:11.293 [2024-06-10 10:16:33.058084] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:11.293 [2024-06-10 10:16:33.059226] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:11.293 [2024-06-10 10:16:33.059244] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:11.293 BaseBdev2 00:21:11.293 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b spare_malloc 00:21:11.553 spare_malloc 00:21:11.553 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:11.814 spare_delay 00:21:11.814 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:11.814 [2024-06-10 10:16:33.621209] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:11.814 [2024-06-10 10:16:33.621236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:11.814 [2024-06-10 10:16:33.621246] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e2450 00:21:11.814 [2024-06-10 10:16:33.621253] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:11.814 [2024-06-10 10:16:33.622452] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:11.814 [2024-06-10 10:16:33.622470] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:11.814 spare 00:21:11.814 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:12.075 [2024-06-10 10:16:33.801684] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:12.075 [2024-06-10 10:16:33.802646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:12.075 [2024-06-10 10:16:33.802762] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18e1960 00:21:12.075 [2024-06-10 10:16:33.802770] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:12.075 [2024-06-10 10:16:33.802914] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1732530 00:21:12.075 [2024-06-10 10:16:33.803020] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18e1960 00:21:12.075 [2024-06-10 10:16:33.803026] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18e1960 00:21:12.075 [2024-06-10 10:16:33.803092] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.075 10:16:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:12.338 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.338 "name": "raid_bdev1", 00:21:12.338 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:12.338 "strip_size_kb": 0, 00:21:12.338 "state": "online", 00:21:12.338 "raid_level": "raid1", 00:21:12.338 "superblock": true, 00:21:12.338 "num_base_bdevs": 2, 00:21:12.338 "num_base_bdevs_discovered": 2, 00:21:12.338 "num_base_bdevs_operational": 2, 00:21:12.338 "base_bdevs_list": [ 00:21:12.338 { 00:21:12.338 "name": "BaseBdev1", 00:21:12.338 "uuid": "7e370c0c-390a-5fbc-84b5-79cad2c983b9", 00:21:12.338 "is_configured": true, 00:21:12.338 "data_offset": 2048, 00:21:12.338 "data_size": 63488 00:21:12.338 }, 00:21:12.338 { 00:21:12.338 "name": "BaseBdev2", 00:21:12.338 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:12.338 "is_configured": true, 00:21:12.338 "data_offset": 2048, 00:21:12.338 "data_size": 63488 00:21:12.338 } 00:21:12.338 ] 00:21:12.338 }' 00:21:12.338 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.338 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:12.908 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:12.908 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:12.908 [2024-06-10 10:16:34.716152] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:12.908 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:21:12.908 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.908 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:13.168 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:21:13.168 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:21:13.168 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:13.168 10:16:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:13.168 [2024-06-10 10:16:35.018190] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e5240 00:21:13.168 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:13.168 Zero copy mechanism will not be used. 00:21:13.168 Running I/O for 60 seconds... 
00:21:13.428 [2024-06-10 10:16:35.103498] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:13.428 [2024-06-10 10:16:35.116532] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x18e5240 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.428 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:13.689 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:13.689 "name": "raid_bdev1", 00:21:13.689 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:13.689 "strip_size_kb": 0, 00:21:13.689 "state": "online", 00:21:13.689 "raid_level": "raid1", 00:21:13.689 "superblock": true, 00:21:13.689 "num_base_bdevs": 2, 00:21:13.689 "num_base_bdevs_discovered": 1, 00:21:13.689 "num_base_bdevs_operational": 1, 00:21:13.689 "base_bdevs_list": [ 00:21:13.689 { 00:21:13.689 "name": null, 00:21:13.689 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:13.689 "is_configured": false, 00:21:13.689 "data_offset": 2048, 00:21:13.689 "data_size": 63488 00:21:13.689 }, 00:21:13.689 { 00:21:13.689 "name": "BaseBdev2", 00:21:13.689 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:13.689 "is_configured": true, 00:21:13.689 "data_offset": 2048, 00:21:13.689 "data_size": 63488 00:21:13.689 } 00:21:13.689 ] 00:21:13.689 }' 00:21:13.689 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:13.689 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:14.259 10:16:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:14.259 [2024-06-10 10:16:36.074432] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:14.539 [2024-06-10 10:16:36.126952] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18673f0 00:21:14.539 10:16:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:14.539 [2024-06-10 10:16:36.128558] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev 
raid_bdev1 00:21:14.539 [2024-06-10 10:16:36.256140] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:14.539 [2024-06-10 10:16:36.387671] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:14.539 [2024-06-10 10:16:36.387805] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:15.109 [2024-06-10 10:16:36.725588] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:15.109 [2024-06-10 10:16:36.947410] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:15.109 [2024-06-10 10:16:36.947518] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:15.369 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:15.369 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:15.369 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:15.369 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:15.369 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:15.369 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.369 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:15.630 [2024-06-10 10:16:37.305157] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:15.630 [2024-06-10 10:16:37.311779] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:15.630 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:15.630 "name": "raid_bdev1", 00:21:15.630 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:15.630 "strip_size_kb": 0, 00:21:15.630 "state": "online", 00:21:15.630 "raid_level": "raid1", 00:21:15.630 "superblock": true, 00:21:15.630 "num_base_bdevs": 2, 00:21:15.630 "num_base_bdevs_discovered": 2, 00:21:15.630 "num_base_bdevs_operational": 2, 00:21:15.630 "process": { 00:21:15.630 "type": "rebuild", 00:21:15.630 "target": "spare", 00:21:15.630 "progress": { 00:21:15.630 "blocks": 16384, 00:21:15.630 "percent": 25 00:21:15.630 } 00:21:15.630 }, 00:21:15.630 "base_bdevs_list": [ 00:21:15.630 { 00:21:15.630 "name": "spare", 00:21:15.630 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:15.630 "is_configured": true, 00:21:15.630 "data_offset": 2048, 00:21:15.630 "data_size": 63488 00:21:15.630 }, 00:21:15.630 { 00:21:15.630 "name": "BaseBdev2", 00:21:15.630 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:15.630 "is_configured": true, 00:21:15.630 "data_offset": 2048, 00:21:15.630 "data_size": 63488 00:21:15.630 } 00:21:15.630 ] 00:21:15.630 }' 00:21:15.630 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:15.630 10:16:37 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:15.630 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:15.630 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:15.630 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:15.892 [2024-06-10 10:16:37.577362] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:15.892 [2024-06-10 10:16:37.622983] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:21:15.892 [2024-06-10 10:16:37.723761] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:15.892 [2024-06-10 10:16:37.731596] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:15.892 [2024-06-10 10:16:37.731615] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:15.892 [2024-06-10 10:16:37.731620] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:15.892 [2024-06-10 10:16:37.754888] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x18e5240 00:21:16.152 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:16.152 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:16.153 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:16.153 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:16.153 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:16.153 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:16.153 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.153 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:16.153 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.153 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.153 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.153 10:16:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:16.153 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.153 "name": "raid_bdev1", 00:21:16.153 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:16.153 "strip_size_kb": 0, 00:21:16.153 "state": "online", 00:21:16.153 "raid_level": "raid1", 00:21:16.153 "superblock": true, 00:21:16.153 "num_base_bdevs": 2, 00:21:16.153 "num_base_bdevs_discovered": 1, 00:21:16.153 "num_base_bdevs_operational": 1, 00:21:16.153 "base_bdevs_list": [ 00:21:16.153 { 00:21:16.153 "name": null, 00:21:16.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.153 "is_configured": 
false, 00:21:16.153 "data_offset": 2048, 00:21:16.153 "data_size": 63488 00:21:16.153 }, 00:21:16.153 { 00:21:16.153 "name": "BaseBdev2", 00:21:16.153 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:16.153 "is_configured": true, 00:21:16.153 "data_offset": 2048, 00:21:16.153 "data_size": 63488 00:21:16.153 } 00:21:16.153 ] 00:21:16.153 }' 00:21:16.153 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.153 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:16.724 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:16.724 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:16.724 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:16.724 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:16.724 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:16.724 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.724 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:16.984 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:16.984 "name": "raid_bdev1", 00:21:16.984 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:16.984 "strip_size_kb": 0, 00:21:16.984 "state": "online", 00:21:16.984 "raid_level": "raid1", 00:21:16.984 "superblock": true, 00:21:16.984 "num_base_bdevs": 2, 00:21:16.984 "num_base_bdevs_discovered": 1, 00:21:16.984 "num_base_bdevs_operational": 1, 00:21:16.984 "base_bdevs_list": [ 00:21:16.984 { 00:21:16.984 "name": null, 00:21:16.984 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.984 "is_configured": false, 00:21:16.984 "data_offset": 2048, 00:21:16.984 "data_size": 63488 00:21:16.984 }, 00:21:16.984 { 00:21:16.985 "name": "BaseBdev2", 00:21:16.985 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:16.985 "is_configured": true, 00:21:16.985 "data_offset": 2048, 00:21:16.985 "data_size": 63488 00:21:16.985 } 00:21:16.985 ] 00:21:16.985 }' 00:21:16.985 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:16.985 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:16.985 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:16.985 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:16.985 10:16:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:17.245 [2024-06-10 10:16:39.015960] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:17.245 10:16:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:17.245 [2024-06-10 10:16:39.068489] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x186acc0 00:21:17.245 [2024-06-10 10:16:39.069616] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid 
bdev raid_bdev1 00:21:17.504 [2024-06-10 10:16:39.190953] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:17.504 [2024-06-10 10:16:39.191139] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:17.764 [2024-06-10 10:16:39.405281] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:17.764 [2024-06-10 10:16:39.405392] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:18.024 [2024-06-10 10:16:39.649656] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:18.024 [2024-06-10 10:16:39.878299] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:18.298 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:18.298 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:18.298 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:18.298 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:18.298 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:18.298 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.298 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.298 [2024-06-10 10:16:40.093620] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:18.564 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:18.564 "name": "raid_bdev1", 00:21:18.564 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:18.564 "strip_size_kb": 0, 00:21:18.564 "state": "online", 00:21:18.564 "raid_level": "raid1", 00:21:18.564 "superblock": true, 00:21:18.564 "num_base_bdevs": 2, 00:21:18.564 "num_base_bdevs_discovered": 2, 00:21:18.564 "num_base_bdevs_operational": 2, 00:21:18.564 "process": { 00:21:18.564 "type": "rebuild", 00:21:18.564 "target": "spare", 00:21:18.564 "progress": { 00:21:18.564 "blocks": 14336, 00:21:18.564 "percent": 22 00:21:18.564 } 00:21:18.564 }, 00:21:18.564 "base_bdevs_list": [ 00:21:18.564 { 00:21:18.564 "name": "spare", 00:21:18.564 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:18.564 "is_configured": true, 00:21:18.564 "data_offset": 2048, 00:21:18.564 "data_size": 63488 00:21:18.564 }, 00:21:18.564 { 00:21:18.564 "name": "BaseBdev2", 00:21:18.564 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:18.564 "is_configured": true, 00:21:18.564 "data_offset": 2048, 00:21:18.564 "data_size": 63488 00:21:18.564 } 00:21:18.564 ] 00:21:18.564 }' 00:21:18.564 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:18.564 [2024-06-10 10:16:40.302405] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:18.564 10:16:40 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:18.564 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:18.564 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:18.564 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:21:18.564 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:21:18.564 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:21:18.564 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:18.564 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:18.564 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:18.565 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=692 00:21:18.565 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:18.565 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:18.565 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:18.565 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:18.565 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:18.565 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:18.565 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.565 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.826 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:18.826 "name": "raid_bdev1", 00:21:18.826 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:18.826 "strip_size_kb": 0, 00:21:18.826 "state": "online", 00:21:18.826 "raid_level": "raid1", 00:21:18.826 "superblock": true, 00:21:18.826 "num_base_bdevs": 2, 00:21:18.826 "num_base_bdevs_discovered": 2, 00:21:18.826 "num_base_bdevs_operational": 2, 00:21:18.826 "process": { 00:21:18.826 "type": "rebuild", 00:21:18.826 "target": "spare", 00:21:18.826 "progress": { 00:21:18.826 "blocks": 18432, 00:21:18.826 "percent": 29 00:21:18.826 } 00:21:18.826 }, 00:21:18.826 "base_bdevs_list": [ 00:21:18.826 { 00:21:18.826 "name": "spare", 00:21:18.826 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:18.826 "is_configured": true, 00:21:18.826 "data_offset": 2048, 00:21:18.826 "data_size": 63488 00:21:18.826 }, 00:21:18.826 { 00:21:18.826 "name": "BaseBdev2", 00:21:18.826 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:18.826 "is_configured": true, 00:21:18.826 "data_offset": 2048, 00:21:18.826 "data_size": 63488 00:21:18.826 } 00:21:18.826 ] 00:21:18.826 }' 00:21:18.826 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:18.826 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:18.826 10:16:40 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:18.826 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:18.826 10:16:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:19.086 [2024-06-10 10:16:40.740700] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:21:19.347 [2024-06-10 10:16:40.985044] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:21:19.607 [2024-06-10 10:16:41.314186] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:21:19.867 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:19.867 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:19.867 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:19.867 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:19.867 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:19.867 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:19.867 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.867 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:20.128 [2024-06-10 10:16:41.758172] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:21:20.128 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:20.128 "name": "raid_bdev1", 00:21:20.128 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:20.128 "strip_size_kb": 0, 00:21:20.128 "state": "online", 00:21:20.128 "raid_level": "raid1", 00:21:20.128 "superblock": true, 00:21:20.128 "num_base_bdevs": 2, 00:21:20.128 "num_base_bdevs_discovered": 2, 00:21:20.128 "num_base_bdevs_operational": 2, 00:21:20.128 "process": { 00:21:20.128 "type": "rebuild", 00:21:20.128 "target": "spare", 00:21:20.128 "progress": { 00:21:20.128 "blocks": 38912, 00:21:20.128 "percent": 61 00:21:20.128 } 00:21:20.128 }, 00:21:20.128 "base_bdevs_list": [ 00:21:20.128 { 00:21:20.128 "name": "spare", 00:21:20.128 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:20.128 "is_configured": true, 00:21:20.128 "data_offset": 2048, 00:21:20.128 "data_size": 63488 00:21:20.128 }, 00:21:20.128 { 00:21:20.128 "name": "BaseBdev2", 00:21:20.128 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:20.128 "is_configured": true, 00:21:20.128 "data_offset": 2048, 00:21:20.128 "data_size": 63488 00:21:20.128 } 00:21:20.128 ] 00:21:20.128 }' 00:21:20.128 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:20.128 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:20.128 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:20.128 10:16:41 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:20.128 10:16:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:20.698 [2024-06-10 10:16:42.417452] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:21:21.269 10:16:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:21.269 10:16:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:21.269 10:16:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:21.269 10:16:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:21.269 10:16:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:21.269 10:16:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:21.269 10:16:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.269 10:16:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.530 10:16:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:21.530 "name": "raid_bdev1", 00:21:21.530 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:21.530 "strip_size_kb": 0, 00:21:21.530 "state": "online", 00:21:21.530 "raid_level": "raid1", 00:21:21.530 "superblock": true, 00:21:21.530 "num_base_bdevs": 2, 00:21:21.530 "num_base_bdevs_discovered": 2, 00:21:21.530 "num_base_bdevs_operational": 2, 00:21:21.530 "process": { 00:21:21.530 "type": "rebuild", 00:21:21.530 "target": "spare", 00:21:21.530 "progress": { 00:21:21.530 "blocks": 61440, 00:21:21.530 "percent": 96 00:21:21.530 } 00:21:21.530 }, 00:21:21.530 "base_bdevs_list": [ 00:21:21.530 { 00:21:21.530 "name": "spare", 00:21:21.530 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:21.530 "is_configured": true, 00:21:21.530 "data_offset": 2048, 00:21:21.530 "data_size": 63488 00:21:21.530 }, 00:21:21.530 { 00:21:21.530 "name": "BaseBdev2", 00:21:21.530 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:21.530 "is_configured": true, 00:21:21.530 "data_offset": 2048, 00:21:21.530 "data_size": 63488 00:21:21.530 } 00:21:21.530 ] 00:21:21.530 }' 00:21:21.530 10:16:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:21.530 10:16:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:21.530 10:16:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:21.530 10:16:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:21.530 10:16:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:21.530 [2024-06-10 10:16:43.210391] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:21.530 [2024-06-10 10:16:43.315570] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:21.530 [2024-06-10 10:16:43.316675] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:22.473 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:22.473 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:22.473 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:22.473 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:22.473 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:22.473 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:22.473 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.473 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:22.734 "name": "raid_bdev1", 00:21:22.734 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:22.734 "strip_size_kb": 0, 00:21:22.734 "state": "online", 00:21:22.734 "raid_level": "raid1", 00:21:22.734 "superblock": true, 00:21:22.734 "num_base_bdevs": 2, 00:21:22.734 "num_base_bdevs_discovered": 2, 00:21:22.734 "num_base_bdevs_operational": 2, 00:21:22.734 "base_bdevs_list": [ 00:21:22.734 { 00:21:22.734 "name": "spare", 00:21:22.734 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:22.734 "is_configured": true, 00:21:22.734 "data_offset": 2048, 00:21:22.734 "data_size": 63488 00:21:22.734 }, 00:21:22.734 { 00:21:22.734 "name": "BaseBdev2", 00:21:22.734 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:22.734 "is_configured": true, 00:21:22.734 "data_offset": 2048, 00:21:22.734 "data_size": 63488 00:21:22.734 } 00:21:22.734 ] 00:21:22.734 }' 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.734 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:22.995 "name": "raid_bdev1", 00:21:22.995 "uuid": 
"03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:22.995 "strip_size_kb": 0, 00:21:22.995 "state": "online", 00:21:22.995 "raid_level": "raid1", 00:21:22.995 "superblock": true, 00:21:22.995 "num_base_bdevs": 2, 00:21:22.995 "num_base_bdevs_discovered": 2, 00:21:22.995 "num_base_bdevs_operational": 2, 00:21:22.995 "base_bdevs_list": [ 00:21:22.995 { 00:21:22.995 "name": "spare", 00:21:22.995 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:22.995 "is_configured": true, 00:21:22.995 "data_offset": 2048, 00:21:22.995 "data_size": 63488 00:21:22.995 }, 00:21:22.995 { 00:21:22.995 "name": "BaseBdev2", 00:21:22.995 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:22.995 "is_configured": true, 00:21:22.995 "data_offset": 2048, 00:21:22.995 "data_size": 63488 00:21:22.995 } 00:21:22.995 ] 00:21:22.995 }' 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.995 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:23.256 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:23.256 "name": "raid_bdev1", 00:21:23.256 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:23.256 "strip_size_kb": 0, 00:21:23.256 "state": "online", 00:21:23.256 "raid_level": "raid1", 00:21:23.256 "superblock": true, 00:21:23.256 "num_base_bdevs": 2, 00:21:23.256 "num_base_bdevs_discovered": 2, 00:21:23.256 "num_base_bdevs_operational": 2, 00:21:23.256 "base_bdevs_list": [ 00:21:23.256 { 00:21:23.256 "name": "spare", 00:21:23.256 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:23.256 "is_configured": true, 00:21:23.256 "data_offset": 2048, 00:21:23.256 "data_size": 63488 00:21:23.256 }, 00:21:23.256 { 00:21:23.256 "name": "BaseBdev2", 00:21:23.256 "uuid": 
"fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:23.256 "is_configured": true, 00:21:23.256 "data_offset": 2048, 00:21:23.256 "data_size": 63488 00:21:23.256 } 00:21:23.256 ] 00:21:23.256 }' 00:21:23.256 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:23.256 10:16:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:23.828 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:23.828 [2024-06-10 10:16:45.663291] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:23.828 [2024-06-10 10:16:45.663312] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:23.828 00:21:23.828 Latency(us) 00:21:23.828 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:23.828 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:21:23.828 raid_bdev1 : 10.64 112.00 336.01 0.00 0.00 12692.32 245.76 114536.76 00:21:23.828 =================================================================================================================== 00:21:23.828 Total : 112.00 336.01 0.00 0.00 12692.32 245.76 114536.76 00:21:23.828 [2024-06-10 10:16:45.690532] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:23.828 [2024-06-10 10:16:45.690556] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:23.828 [2024-06-10 10:16:45.690613] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:23.828 [2024-06-10 10:16:45.690619] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e1960 name raid_bdev1, state offline 00:21:23.828 0 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:24.089 10:16:45 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:24.351 /dev/nbd0 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:24.351 1+0 records in 00:21:24.351 1+0 records out 00:21:24.351 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232052 s, 17.7 MB/s 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 
00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:24.351 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:21:24.612 /dev/nbd1 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:24.612 1+0 records in 00:21:24.612 1+0 records out 00:21:24.612 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000204776 s, 20.0 MB/s 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:24.612 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:24.873 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:25.134 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:25.134 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:25.134 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:25.134 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:25.134 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:25.134 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:25.134 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:25.134 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:25.134 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:21:25.134 10:16:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:25.396 [2024-06-10 10:16:47.224693] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:25.396 [2024-06-10 10:16:47.224728] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:25.396 [2024-06-10 10:16:47.224741] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18df2c0 00:21:25.396 [2024-06-10 10:16:47.224747] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:25.396 [2024-06-10 10:16:47.226123] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:25.396 [2024-06-10 10:16:47.226146] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:25.396 [2024-06-10 10:16:47.226217] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:25.396 [2024-06-10 10:16:47.226237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:25.396 [2024-06-10 10:16:47.226317] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:25.396 spare 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.396 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.657 [2024-06-10 10:16:47.326609] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17463b0 00:21:25.657 [2024-06-10 10:16:47.326617] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:25.657 [2024-06-10 10:16:47.326776] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1732520 00:21:25.657 [2024-06-10 10:16:47.326896] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17463b0 00:21:25.657 [2024-06-10 10:16:47.326902] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17463b0 00:21:25.657 [2024-06-10 10:16:47.326984] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:25.657 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:25.657 "name": "raid_bdev1", 00:21:25.657 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:25.657 "strip_size_kb": 0, 00:21:25.657 "state": "online", 00:21:25.657 "raid_level": "raid1", 00:21:25.657 "superblock": true, 00:21:25.657 "num_base_bdevs": 2, 00:21:25.657 "num_base_bdevs_discovered": 2, 00:21:25.657 "num_base_bdevs_operational": 2, 00:21:25.657 "base_bdevs_list": [ 00:21:25.657 { 00:21:25.657 "name": "spare", 00:21:25.657 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:25.657 "is_configured": true, 00:21:25.657 
"data_offset": 2048, 00:21:25.657 "data_size": 63488 00:21:25.657 }, 00:21:25.657 { 00:21:25.657 "name": "BaseBdev2", 00:21:25.657 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:25.657 "is_configured": true, 00:21:25.657 "data_offset": 2048, 00:21:25.657 "data_size": 63488 00:21:25.657 } 00:21:25.657 ] 00:21:25.657 }' 00:21:25.657 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:25.657 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:26.228 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:26.228 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:26.228 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:26.228 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:26.228 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:26.228 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.229 10:16:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:26.489 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:26.489 "name": "raid_bdev1", 00:21:26.489 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:26.489 "strip_size_kb": 0, 00:21:26.489 "state": "online", 00:21:26.489 "raid_level": "raid1", 00:21:26.489 "superblock": true, 00:21:26.489 "num_base_bdevs": 2, 00:21:26.489 "num_base_bdevs_discovered": 2, 00:21:26.489 "num_base_bdevs_operational": 2, 00:21:26.489 "base_bdevs_list": [ 00:21:26.489 { 00:21:26.489 "name": "spare", 00:21:26.489 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:26.489 "is_configured": true, 00:21:26.489 "data_offset": 2048, 00:21:26.489 "data_size": 63488 00:21:26.489 }, 00:21:26.489 { 00:21:26.489 "name": "BaseBdev2", 00:21:26.489 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:26.489 "is_configured": true, 00:21:26.489 "data_offset": 2048, 00:21:26.489 "data_size": 63488 00:21:26.489 } 00:21:26.489 ] 00:21:26.489 }' 00:21:26.489 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:26.489 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:26.489 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:26.489 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:26.489 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.489 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:21:26.780 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:21:26.780 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:26.780 [2024-06-10 10:16:48.640517] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:27.040 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:27.040 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:27.040 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:27.040 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:27.040 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:27.040 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:27.040 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:27.040 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:27.040 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:27.040 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:27.041 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.041 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:27.041 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.041 "name": "raid_bdev1", 00:21:27.041 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:27.041 "strip_size_kb": 0, 00:21:27.041 "state": "online", 00:21:27.041 "raid_level": "raid1", 00:21:27.041 "superblock": true, 00:21:27.041 "num_base_bdevs": 2, 00:21:27.041 "num_base_bdevs_discovered": 1, 00:21:27.041 "num_base_bdevs_operational": 1, 00:21:27.041 "base_bdevs_list": [ 00:21:27.041 { 00:21:27.041 "name": null, 00:21:27.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.041 "is_configured": false, 00:21:27.041 "data_offset": 2048, 00:21:27.041 "data_size": 63488 00:21:27.041 }, 00:21:27.041 { 00:21:27.041 "name": "BaseBdev2", 00:21:27.041 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:27.041 "is_configured": true, 00:21:27.041 "data_offset": 2048, 00:21:27.041 "data_size": 63488 00:21:27.041 } 00:21:27.041 ] 00:21:27.041 }' 00:21:27.041 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.041 10:16:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:27.611 10:16:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:27.871 [2024-06-10 10:16:49.542916] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:27.871 [2024-06-10 10:16:49.543038] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:27.871 [2024-06-10 10:16:49.543048] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
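The @752-@756 steps above drive a full degrade-and-rebuild cycle over JSON-RPC. A condensed sketch with the socket path and bdev names from this run (the script additionally sleeps and re-verifies the array state between steps, which is omitted here):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # detach the member; raid_bdev1 keeps serving I/O degraded (1 of 2 members)
  $rpc -s $sock bdev_raid_remove_base_bdev spare
  # hand it back; it still carries an older superblock (seq 4 < 5), so it is re-added and a rebuild starts
  $rpc -s $sock bdev_raid_add_base_bdev raid_bdev1 spare
  # confirm a background rebuild targeting "spare" is running
  $rpc -s $sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"'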
00:21:27.871 [2024-06-10 10:16:49.543068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:27.871 [2024-06-10 10:16:49.546724] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17469a0 00:21:27.871 [2024-06-10 10:16:49.548339] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:27.871 10:16:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:21:28.810 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:28.810 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:28.810 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:28.810 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:28.810 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:28.810 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.810 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:29.070 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:29.070 "name": "raid_bdev1", 00:21:29.070 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:29.070 "strip_size_kb": 0, 00:21:29.070 "state": "online", 00:21:29.070 "raid_level": "raid1", 00:21:29.070 "superblock": true, 00:21:29.070 "num_base_bdevs": 2, 00:21:29.070 "num_base_bdevs_discovered": 2, 00:21:29.070 "num_base_bdevs_operational": 2, 00:21:29.070 "process": { 00:21:29.070 "type": "rebuild", 00:21:29.070 "target": "spare", 00:21:29.070 "progress": { 00:21:29.070 "blocks": 22528, 00:21:29.070 "percent": 35 00:21:29.070 } 00:21:29.070 }, 00:21:29.070 "base_bdevs_list": [ 00:21:29.070 { 00:21:29.070 "name": "spare", 00:21:29.070 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:29.070 "is_configured": true, 00:21:29.070 "data_offset": 2048, 00:21:29.070 "data_size": 63488 00:21:29.070 }, 00:21:29.070 { 00:21:29.070 "name": "BaseBdev2", 00:21:29.070 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:29.070 "is_configured": true, 00:21:29.070 "data_offset": 2048, 00:21:29.070 "data_size": 63488 00:21:29.070 } 00:21:29.070 ] 00:21:29.070 }' 00:21:29.070 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:29.070 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:29.070 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:29.070 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:29.070 10:16:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:29.330 [2024-06-10 10:16:51.029210] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:29.330 [2024-06-10 10:16:51.057229] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:29.330 [2024-06-10 10:16:51.057263] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:29.330 [2024-06-10 10:16:51.057274] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:29.330 [2024-06-10 10:16:51.057278] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.330 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:29.590 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:29.590 "name": "raid_bdev1", 00:21:29.590 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:29.590 "strip_size_kb": 0, 00:21:29.590 "state": "online", 00:21:29.590 "raid_level": "raid1", 00:21:29.590 "superblock": true, 00:21:29.590 "num_base_bdevs": 2, 00:21:29.590 "num_base_bdevs_discovered": 1, 00:21:29.590 "num_base_bdevs_operational": 1, 00:21:29.590 "base_bdevs_list": [ 00:21:29.590 { 00:21:29.590 "name": null, 00:21:29.590 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:29.590 "is_configured": false, 00:21:29.590 "data_offset": 2048, 00:21:29.590 "data_size": 63488 00:21:29.590 }, 00:21:29.590 { 00:21:29.590 "name": "BaseBdev2", 00:21:29.590 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:29.590 "is_configured": true, 00:21:29.590 "data_offset": 2048, 00:21:29.590 "data_size": 63488 00:21:29.590 } 00:21:29.590 ] 00:21:29.590 }' 00:21:29.590 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:29.590 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:30.159 10:16:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:30.159 [2024-06-10 10:16:51.995763] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:30.159 [2024-06-10 10:16:51.995800] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:30.159 [2024-06-10 10:16:51.995815] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1746730 00:21:30.159 
[2024-06-10 10:16:51.995825] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:30.159 [2024-06-10 10:16:51.996139] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:30.159 [2024-06-10 10:16:51.996150] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:30.159 [2024-06-10 10:16:51.996212] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:30.159 [2024-06-10 10:16:51.996219] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:30.159 [2024-06-10 10:16:51.996230] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:30.160 [2024-06-10 10:16:51.996241] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:30.160 [2024-06-10 10:16:51.999893] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18df7a0 00:21:30.160 spare 00:21:30.160 [2024-06-10 10:16:52.001030] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:30.160 10:16:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:31.540 "name": "raid_bdev1", 00:21:31.540 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:31.540 "strip_size_kb": 0, 00:21:31.540 "state": "online", 00:21:31.540 "raid_level": "raid1", 00:21:31.540 "superblock": true, 00:21:31.540 "num_base_bdevs": 2, 00:21:31.540 "num_base_bdevs_discovered": 2, 00:21:31.540 "num_base_bdevs_operational": 2, 00:21:31.540 "process": { 00:21:31.540 "type": "rebuild", 00:21:31.540 "target": "spare", 00:21:31.540 "progress": { 00:21:31.540 "blocks": 22528, 00:21:31.540 "percent": 35 00:21:31.540 } 00:21:31.540 }, 00:21:31.540 "base_bdevs_list": [ 00:21:31.540 { 00:21:31.540 "name": "spare", 00:21:31.540 "uuid": "5abfef87-40dc-5310-adba-18477cde6192", 00:21:31.540 "is_configured": true, 00:21:31.540 "data_offset": 2048, 00:21:31.540 "data_size": 63488 00:21:31.540 }, 00:21:31.540 { 00:21:31.540 "name": "BaseBdev2", 00:21:31.540 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:31.540 "is_configured": true, 00:21:31.540 "data_offset": 2048, 00:21:31.540 "data_size": 63488 00:21:31.540 } 00:21:31.540 ] 00:21:31.540 }' 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:31.540 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:31.800 [2024-06-10 10:16:53.484424] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:31.800 [2024-06-10 10:16:53.509950] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:31.800 [2024-06-10 10:16:53.509982] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:31.800 [2024-06-10 10:16:53.509992] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:31.800 [2024-06-10 10:16:53.509996] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.800 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.059 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:32.059 "name": "raid_bdev1", 00:21:32.060 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:32.060 "strip_size_kb": 0, 00:21:32.060 "state": "online", 00:21:32.060 "raid_level": "raid1", 00:21:32.060 "superblock": true, 00:21:32.060 "num_base_bdevs": 2, 00:21:32.060 "num_base_bdevs_discovered": 1, 00:21:32.060 "num_base_bdevs_operational": 1, 00:21:32.060 "base_bdevs_list": [ 00:21:32.060 { 00:21:32.060 "name": null, 00:21:32.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.060 "is_configured": false, 00:21:32.060 "data_offset": 2048, 00:21:32.060 "data_size": 63488 00:21:32.060 }, 00:21:32.060 { 00:21:32.060 "name": "BaseBdev2", 00:21:32.060 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:32.060 "is_configured": true, 00:21:32.060 "data_offset": 2048, 00:21:32.060 "data_size": 63488 00:21:32.060 } 00:21:32.060 ] 00:21:32.060 
}' 00:21:32.060 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:32.060 10:16:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:32.628 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:32.628 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:32.628 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:32.628 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:32.628 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:32.628 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.628 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.628 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:32.628 "name": "raid_bdev1", 00:21:32.628 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:32.628 "strip_size_kb": 0, 00:21:32.628 "state": "online", 00:21:32.628 "raid_level": "raid1", 00:21:32.628 "superblock": true, 00:21:32.628 "num_base_bdevs": 2, 00:21:32.628 "num_base_bdevs_discovered": 1, 00:21:32.628 "num_base_bdevs_operational": 1, 00:21:32.628 "base_bdevs_list": [ 00:21:32.628 { 00:21:32.628 "name": null, 00:21:32.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.628 "is_configured": false, 00:21:32.628 "data_offset": 2048, 00:21:32.628 "data_size": 63488 00:21:32.628 }, 00:21:32.628 { 00:21:32.628 "name": "BaseBdev2", 00:21:32.628 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:32.628 "is_configured": true, 00:21:32.628 "data_offset": 2048, 00:21:32.628 "data_size": 63488 00:21:32.628 } 00:21:32.628 ] 00:21:32.628 }' 00:21:32.628 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:32.888 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:32.888 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:32.888 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:32.888 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:32.888 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:33.148 [2024-06-10 10:16:54.908668] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:33.148 [2024-06-10 10:16:54.908706] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:33.148 [2024-06-10 10:16:54.908718] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e15c0 00:21:33.148 [2024-06-10 10:16:54.908725] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:33.148 [2024-06-10 10:16:54.909009] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:21:33.148 [2024-06-10 10:16:54.909021] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:33.148 [2024-06-10 10:16:54.909068] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:33.148 [2024-06-10 10:16:54.909075] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:33.148 [2024-06-10 10:16:54.909079] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:33.148 BaseBdev1 00:21:33.148 10:16:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.088 10:16:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:34.348 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:34.348 "name": "raid_bdev1", 00:21:34.348 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:34.348 "strip_size_kb": 0, 00:21:34.348 "state": "online", 00:21:34.348 "raid_level": "raid1", 00:21:34.348 "superblock": true, 00:21:34.348 "num_base_bdevs": 2, 00:21:34.348 "num_base_bdevs_discovered": 1, 00:21:34.348 "num_base_bdevs_operational": 1, 00:21:34.348 "base_bdevs_list": [ 00:21:34.348 { 00:21:34.348 "name": null, 00:21:34.348 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:34.348 "is_configured": false, 00:21:34.348 "data_offset": 2048, 00:21:34.348 "data_size": 63488 00:21:34.348 }, 00:21:34.348 { 00:21:34.348 "name": "BaseBdev2", 00:21:34.348 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:34.348 "is_configured": true, 00:21:34.348 "data_offset": 2048, 00:21:34.348 "data_size": 63488 00:21:34.348 } 00:21:34.348 ] 00:21:34.348 }' 00:21:34.348 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:34.348 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:34.918 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:34.918 10:16:56 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:34.918 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:34.918 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:34.918 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:34.918 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.918 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:35.178 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:35.178 "name": "raid_bdev1", 00:21:35.178 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:35.178 "strip_size_kb": 0, 00:21:35.178 "state": "online", 00:21:35.179 "raid_level": "raid1", 00:21:35.179 "superblock": true, 00:21:35.179 "num_base_bdevs": 2, 00:21:35.179 "num_base_bdevs_discovered": 1, 00:21:35.179 "num_base_bdevs_operational": 1, 00:21:35.179 "base_bdevs_list": [ 00:21:35.179 { 00:21:35.179 "name": null, 00:21:35.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.179 "is_configured": false, 00:21:35.179 "data_offset": 2048, 00:21:35.179 "data_size": 63488 00:21:35.179 }, 00:21:35.179 { 00:21:35.179 "name": "BaseBdev2", 00:21:35.179 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:35.179 "is_configured": true, 00:21:35.179 "data_offset": 2048, 00:21:35.179 "data_size": 63488 00:21:35.179 } 00:21:35.179 ] 00:21:35.179 }' 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@649 -- # local es=0 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:35.179 10:16:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:35.438 [2024-06-10 10:16:57.126568] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:35.438 [2024-06-10 10:16:57.126667] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:35.438 [2024-06-10 10:16:57.126675] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:35.438 request: 00:21:35.438 { 00:21:35.438 "raid_bdev": "raid_bdev1", 00:21:35.438 "base_bdev": "BaseBdev1", 00:21:35.438 "method": "bdev_raid_add_base_bdev", 00:21:35.438 "req_id": 1 00:21:35.438 } 00:21:35.438 Got JSON-RPC error response 00:21:35.438 response: 00:21:35.438 { 00:21:35.438 "code": -22, 00:21:35.438 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:35.438 } 00:21:35.438 10:16:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # es=1 00:21:35.438 10:16:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:21:35.438 10:16:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:21:35.438 10:16:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:21:35.438 10:16:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.377 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:36.638 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:21:36.638 "name": "raid_bdev1", 00:21:36.638 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:36.638 "strip_size_kb": 0, 00:21:36.638 "state": "online", 00:21:36.638 "raid_level": "raid1", 00:21:36.638 "superblock": true, 00:21:36.638 "num_base_bdevs": 2, 00:21:36.638 "num_base_bdevs_discovered": 1, 00:21:36.638 "num_base_bdevs_operational": 1, 00:21:36.638 "base_bdevs_list": [ 00:21:36.638 { 00:21:36.638 "name": null, 00:21:36.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.638 "is_configured": false, 00:21:36.638 "data_offset": 2048, 00:21:36.638 "data_size": 63488 00:21:36.638 }, 00:21:36.638 { 00:21:36.638 "name": "BaseBdev2", 00:21:36.638 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:36.638 "is_configured": true, 00:21:36.638 "data_offset": 2048, 00:21:36.638 "data_size": 63488 00:21:36.638 } 00:21:36.638 ] 00:21:36.638 }' 00:21:36.638 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:36.638 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:37.208 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:37.208 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:37.208 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:37.208 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:37.208 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:37.208 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.208 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:37.208 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:37.208 "name": "raid_bdev1", 00:21:37.208 "uuid": "03ba6b84-0296-4637-8749-6931fef78d4c", 00:21:37.208 "strip_size_kb": 0, 00:21:37.208 "state": "online", 00:21:37.208 "raid_level": "raid1", 00:21:37.208 "superblock": true, 00:21:37.208 "num_base_bdevs": 2, 00:21:37.208 "num_base_bdevs_discovered": 1, 00:21:37.208 "num_base_bdevs_operational": 1, 00:21:37.208 "base_bdevs_list": [ 00:21:37.208 { 00:21:37.208 "name": null, 00:21:37.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.208 "is_configured": false, 00:21:37.208 "data_offset": 2048, 00:21:37.208 "data_size": 63488 00:21:37.208 }, 00:21:37.208 { 00:21:37.208 "name": "BaseBdev2", 00:21:37.208 "uuid": "fb979ac8-4d69-5aaf-b503-aed22306eca3", 00:21:37.208 "is_configured": true, 00:21:37.208 "data_offset": 2048, 00:21:37.208 "data_size": 63488 00:21:37.208 } 00:21:37.208 ] 00:21:37.208 }' 00:21:37.208 10:16:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:37.208 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:37.208 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:37.208 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:37.208 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1081303 00:21:37.208 10:16:59 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@949 -- # '[' -z 1081303 ']' 00:21:37.208 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # kill -0 1081303 00:21:37.208 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # uname 00:21:37.208 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:37.208 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1081303 00:21:37.468 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:37.468 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:37.468 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1081303' 00:21:37.468 killing process with pid 1081303 00:21:37.468 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # kill 1081303 00:21:37.468 Received shutdown signal, test time was about 24.034052 seconds 00:21:37.468 00:21:37.468 Latency(us) 00:21:37.468 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:37.468 =================================================================================================================== 00:21:37.468 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:37.468 [2024-06-10 10:16:59.110401] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:37.468 [2024-06-10 10:16:59.110474] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:37.468 [2024-06-10 10:16:59.110507] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:37.468 [2024-06-10 10:16:59.110513] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17463b0 name raid_bdev1, state offline 00:21:37.468 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@973 -- # wait 1081303 00:21:37.468 [2024-06-10 10:16:59.122721] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:37.468 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:21:37.468 00:21:37.468 real 0m27.796s 00:21:37.468 user 0m43.462s 00:21:37.468 sys 0m3.066s 00:21:37.468 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:37.468 10:16:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:37.468 ************************************ 00:21:37.468 END TEST raid_rebuild_test_sb_io 00:21:37.468 ************************************ 00:21:37.468 10:16:59 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:21:37.468 10:16:59 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:21:37.468 10:16:59 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:21:37.468 10:16:59 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:37.468 10:16:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:37.468 ************************************ 00:21:37.468 START TEST raid_rebuild_test 00:21:37.468 ************************************ 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 false false true 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local 
raid_level=raid1 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1086432 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1086432 /var/tmp/spdk-raid.sock 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@830 -- # '[' -z 1086432 ']' 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:37.469 10:16:59 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:37.469 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:37.469 10:16:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:37.729 10:16:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:37.729 [2024-06-10 10:16:59.386785] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:21:37.729 [2024-06-10 10:16:59.386844] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1086432 ] 00:21:37.729 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:37.729 Zero copy mechanism will not be used. 00:21:37.729 [2024-06-10 10:16:59.476900] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:37.729 [2024-06-10 10:16:59.543978] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:37.729 [2024-06-10 10:16:59.586155] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:37.729 [2024-06-10 10:16:59.586193] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:38.668 10:17:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:38.668 10:17:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@863 -- # return 0 00:21:38.668 10:17:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:38.668 10:17:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:38.668 BaseBdev1_malloc 00:21:38.668 10:17:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:38.668 [2024-06-10 10:17:00.491661] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:38.668 [2024-06-10 10:17:00.491696] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:38.668 [2024-06-10 10:17:00.491709] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24b0a50 00:21:38.668 [2024-06-10 10:17:00.491716] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:38.668 [2024-06-10 10:17:00.493104] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:38.668 [2024-06-10 10:17:00.493124] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:38.668 BaseBdev1 00:21:38.668 10:17:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:38.668 10:17:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:38.927 BaseBdev2_malloc 00:21:38.927 10:17:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:39.187 [2024-06-10 10:17:00.886567] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:39.187 [2024-06-10 10:17:00.886597] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.187 [2024-06-10 10:17:00.886609] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24b15a0 00:21:39.187 [2024-06-10 10:17:00.886616] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.187 [2024-06-10 10:17:00.887811] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.187 [2024-06-10 10:17:00.887835] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:39.187 BaseBdev2 00:21:39.187 10:17:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:39.187 10:17:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:39.447 BaseBdev3_malloc 00:21:39.447 10:17:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:21:39.447 [2024-06-10 10:17:01.269436] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:21:39.447 [2024-06-10 10:17:01.269464] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.447 [2024-06-10 10:17:01.269474] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x265da30 00:21:39.447 [2024-06-10 10:17:01.269480] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.447 [2024-06-10 10:17:01.270667] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.447 [2024-06-10 10:17:01.270685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:39.447 BaseBdev3 00:21:39.447 10:17:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:39.447 10:17:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:39.707 BaseBdev4_malloc 00:21:39.707 10:17:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:21:39.967 [2024-06-10 10:17:01.644298] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:21:39.967 [2024-06-10 10:17:01.644328] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.967 [2024-06-10 10:17:01.644339] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x265c2c0 00:21:39.967 [2024-06-10 10:17:01.644345] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.967 [2024-06-10 10:17:01.645532] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.967 [2024-06-10 10:17:01.645551] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 
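Each base device in these tests is a RAM-backed malloc bdev wrapped in a passthru bdev, so the test can hot-remove a member (by deleting its passthru) while the data underneath survives; the spare additionally sits behind a delay bdev, presumably so the rebuild and its progress stay observable. A condensed sketch of the construction for this 4-disk RAID1 case, using the RPC invocations this run issues (the per-member loop is a condensation; sizes, latencies and names are the ones from the trace):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  for n in 1 2 3 4; do
      # 32 MB malloc bdev with 512-byte blocks, fronted by a passthru bdev
      $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev${n}_malloc
      $rpc -s $sock bdev_passthru_create -b BaseBdev${n}_malloc -p BaseBdev${n}
  done
  # the spare gets an extra delay bdev (latency arguments are in microseconds) before its passthru
  $rpc -s $sock bdev_malloc_create 32 512 -b spare_malloc
  $rpc -s $sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
  $rpc -s $sock bdev_passthru_create -b spare_delay -p spare
  # assemble the array
  $rpc -s $sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1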
00:21:39.967 BaseBdev4 00:21:39.967 10:17:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:39.967 spare_malloc 00:21:40.227 10:17:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:40.227 spare_delay 00:21:40.227 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:40.486 [2024-06-10 10:17:02.187374] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:40.486 [2024-06-10 10:17:02.187403] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:40.486 [2024-06-10 10:17:02.187414] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26622d0 00:21:40.486 [2024-06-10 10:17:02.187421] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:40.486 [2024-06-10 10:17:02.188609] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:40.486 [2024-06-10 10:17:02.188629] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:40.486 spare 00:21:40.486 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:21:40.769 [2024-06-10 10:17:02.375878] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:40.769 [2024-06-10 10:17:02.376878] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:40.769 [2024-06-10 10:17:02.376919] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:40.769 [2024-06-10 10:17:02.376954] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:40.769 [2024-06-10 10:17:02.377012] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25e1e20 00:21:40.769 [2024-06-10 10:17:02.377018] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:40.769 [2024-06-10 10:17:02.377182] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25e51d0 00:21:40.769 [2024-06-10 10:17:02.377293] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25e1e20 00:21:40.769 [2024-06-10 10:17:02.377298] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25e1e20 00:21:40.769 [2024-06-10 10:17:02.377376] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.769 "name": "raid_bdev1", 00:21:40.769 "uuid": "d6d00d38-679b-4cc2-8784-d176733032b9", 00:21:40.769 "strip_size_kb": 0, 00:21:40.769 "state": "online", 00:21:40.769 "raid_level": "raid1", 00:21:40.769 "superblock": false, 00:21:40.769 "num_base_bdevs": 4, 00:21:40.769 "num_base_bdevs_discovered": 4, 00:21:40.769 "num_base_bdevs_operational": 4, 00:21:40.769 "base_bdevs_list": [ 00:21:40.769 { 00:21:40.769 "name": "BaseBdev1", 00:21:40.769 "uuid": "b0be0625-5bd0-57fd-9038-d1b0d3f45df7", 00:21:40.769 "is_configured": true, 00:21:40.769 "data_offset": 0, 00:21:40.769 "data_size": 65536 00:21:40.769 }, 00:21:40.769 { 00:21:40.769 "name": "BaseBdev2", 00:21:40.769 "uuid": "53123294-5f18-548b-8f78-f2b5514d2dc7", 00:21:40.769 "is_configured": true, 00:21:40.769 "data_offset": 0, 00:21:40.769 "data_size": 65536 00:21:40.769 }, 00:21:40.769 { 00:21:40.769 "name": "BaseBdev3", 00:21:40.769 "uuid": "45ab97b3-3e6d-5ff6-8304-a507e3b273a7", 00:21:40.769 "is_configured": true, 00:21:40.769 "data_offset": 0, 00:21:40.769 "data_size": 65536 00:21:40.769 }, 00:21:40.769 { 00:21:40.769 "name": "BaseBdev4", 00:21:40.769 "uuid": "33182848-25d5-5f08-8f11-aa20ab0588a7", 00:21:40.769 "is_configured": true, 00:21:40.769 "data_offset": 0, 00:21:40.769 "data_size": 65536 00:21:40.769 } 00:21:40.769 ] 00:21:40.769 }' 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.769 10:17:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:41.345 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:41.345 10:17:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:41.345 [2024-06-10 10:17:03.166072] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:41.345 10:17:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:21:41.345 10:17:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.345 10:17:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:41.638 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:41.899 [2024-06-10 10:17:03.550903] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25e19c0 00:21:41.899 /dev/nbd0 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:41.899 1+0 records in 00:21:41.899 1+0 records out 00:21:41.899 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227093 s, 18.0 MB/s 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:41.899 10:17:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:21:51.891 65536+0 records in 00:21:51.891 65536+0 records out 00:21:51.891 33554432 bytes (34 MB, 32 MiB) copied, 8.91754 s, 3.8 MB/s 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:51.891 [2024-06-10 10:17:12.720514] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:51.891 [2024-06-10 10:17:12.897161] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:51.891 
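The degraded state being verified here was produced by the steps traced just above: raid_bdev1 is exported over NBD, written end to end with random data, unexported, and then one member is pulled so only 3 of the 4 base bdevs remain. Condensed into the underlying commands (rpc shorthand as in the earlier sketch, assumed already defined):

$rpc nbd_start_disk raid_bdev1 /dev/nbd0
# fill all 65536 512-byte blocks (32 MB) through the raid1 volume
dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct
$rpc nbd_stop_disk /dev/nbd0
# degrade the array: detach BaseBdev1, leaving 3 of 4 members online
$rpc bdev_raid_remove_base_bdev BaseBdev1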
10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.891 10:17:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:51.891 10:17:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:51.891 "name": "raid_bdev1", 00:21:51.891 "uuid": "d6d00d38-679b-4cc2-8784-d176733032b9", 00:21:51.891 "strip_size_kb": 0, 00:21:51.891 "state": "online", 00:21:51.891 "raid_level": "raid1", 00:21:51.891 "superblock": false, 00:21:51.891 "num_base_bdevs": 4, 00:21:51.891 "num_base_bdevs_discovered": 3, 00:21:51.891 "num_base_bdevs_operational": 3, 00:21:51.891 "base_bdevs_list": [ 00:21:51.891 { 00:21:51.891 "name": null, 00:21:51.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.891 "is_configured": false, 00:21:51.891 "data_offset": 0, 00:21:51.891 "data_size": 65536 00:21:51.891 }, 00:21:51.891 { 00:21:51.891 "name": "BaseBdev2", 00:21:51.891 "uuid": "53123294-5f18-548b-8f78-f2b5514d2dc7", 00:21:51.891 "is_configured": true, 00:21:51.891 "data_offset": 0, 00:21:51.891 "data_size": 65536 00:21:51.891 }, 00:21:51.891 { 00:21:51.891 "name": "BaseBdev3", 00:21:51.891 "uuid": "45ab97b3-3e6d-5ff6-8304-a507e3b273a7", 00:21:51.891 "is_configured": true, 00:21:51.891 "data_offset": 0, 00:21:51.891 "data_size": 65536 00:21:51.891 }, 00:21:51.891 { 00:21:51.891 "name": "BaseBdev4", 00:21:51.891 "uuid": "33182848-25d5-5f08-8f11-aa20ab0588a7", 00:21:51.891 "is_configured": true, 00:21:51.891 "data_offset": 0, 00:21:51.891 "data_size": 65536 00:21:51.891 } 00:21:51.891 ] 00:21:51.891 }' 00:21:51.891 10:17:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:51.891 10:17:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:51.891 10:17:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:52.151 [2024-06-10 10:17:13.759350] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:52.151 [2024-06-10 10:17:13.762129] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25e51a0 00:21:52.151 [2024-06-10 10:17:13.763762] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:52.151 10:17:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:53.090 10:17:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:53.090 10:17:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:53.090 10:17:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:53.090 10:17:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:53.090 10:17:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:53.090 10:17:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.090 10:17:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:21:53.350 10:17:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:53.350 "name": "raid_bdev1", 00:21:53.350 "uuid": "d6d00d38-679b-4cc2-8784-d176733032b9", 00:21:53.350 "strip_size_kb": 0, 00:21:53.350 "state": "online", 00:21:53.350 "raid_level": "raid1", 00:21:53.350 "superblock": false, 00:21:53.350 "num_base_bdevs": 4, 00:21:53.350 "num_base_bdevs_discovered": 4, 00:21:53.350 "num_base_bdevs_operational": 4, 00:21:53.350 "process": { 00:21:53.350 "type": "rebuild", 00:21:53.350 "target": "spare", 00:21:53.350 "progress": { 00:21:53.350 "blocks": 22528, 00:21:53.350 "percent": 34 00:21:53.350 } 00:21:53.350 }, 00:21:53.350 "base_bdevs_list": [ 00:21:53.350 { 00:21:53.350 "name": "spare", 00:21:53.350 "uuid": "9103341a-76b2-5361-a8c5-d5aee6a5baa0", 00:21:53.350 "is_configured": true, 00:21:53.350 "data_offset": 0, 00:21:53.350 "data_size": 65536 00:21:53.350 }, 00:21:53.350 { 00:21:53.350 "name": "BaseBdev2", 00:21:53.350 "uuid": "53123294-5f18-548b-8f78-f2b5514d2dc7", 00:21:53.350 "is_configured": true, 00:21:53.350 "data_offset": 0, 00:21:53.350 "data_size": 65536 00:21:53.350 }, 00:21:53.350 { 00:21:53.350 "name": "BaseBdev3", 00:21:53.350 "uuid": "45ab97b3-3e6d-5ff6-8304-a507e3b273a7", 00:21:53.350 "is_configured": true, 00:21:53.350 "data_offset": 0, 00:21:53.350 "data_size": 65536 00:21:53.350 }, 00:21:53.350 { 00:21:53.350 "name": "BaseBdev4", 00:21:53.350 "uuid": "33182848-25d5-5f08-8f11-aa20ab0588a7", 00:21:53.350 "is_configured": true, 00:21:53.350 "data_offset": 0, 00:21:53.350 "data_size": 65536 00:21:53.350 } 00:21:53.350 ] 00:21:53.350 }' 00:21:53.350 10:17:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:53.350 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:53.350 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:53.350 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:53.350 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:53.610 [2024-06-10 10:17:15.224536] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:53.610 [2024-06-10 10:17:15.272645] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:53.610 [2024-06-10 10:17:15.272674] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:53.610 [2024-06-10 10:17:15.272685] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:53.610 [2024-06-10 10:17:15.272689] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.610 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:53.871 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:53.871 "name": "raid_bdev1", 00:21:53.871 "uuid": "d6d00d38-679b-4cc2-8784-d176733032b9", 00:21:53.871 "strip_size_kb": 0, 00:21:53.871 "state": "online", 00:21:53.871 "raid_level": "raid1", 00:21:53.871 "superblock": false, 00:21:53.871 "num_base_bdevs": 4, 00:21:53.871 "num_base_bdevs_discovered": 3, 00:21:53.871 "num_base_bdevs_operational": 3, 00:21:53.871 "base_bdevs_list": [ 00:21:53.871 { 00:21:53.871 "name": null, 00:21:53.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:53.871 "is_configured": false, 00:21:53.871 "data_offset": 0, 00:21:53.871 "data_size": 65536 00:21:53.871 }, 00:21:53.871 { 00:21:53.871 "name": "BaseBdev2", 00:21:53.871 "uuid": "53123294-5f18-548b-8f78-f2b5514d2dc7", 00:21:53.871 "is_configured": true, 00:21:53.871 "data_offset": 0, 00:21:53.871 "data_size": 65536 00:21:53.871 }, 00:21:53.871 { 00:21:53.871 "name": "BaseBdev3", 00:21:53.871 "uuid": "45ab97b3-3e6d-5ff6-8304-a507e3b273a7", 00:21:53.871 "is_configured": true, 00:21:53.871 "data_offset": 0, 00:21:53.871 "data_size": 65536 00:21:53.871 }, 00:21:53.871 { 00:21:53.871 "name": "BaseBdev4", 00:21:53.871 "uuid": "33182848-25d5-5f08-8f11-aa20ab0588a7", 00:21:53.871 "is_configured": true, 00:21:53.871 "data_offset": 0, 00:21:53.871 "data_size": 65536 00:21:53.871 } 00:21:53.871 ] 00:21:53.871 }' 00:21:53.871 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:53.871 10:17:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:54.132 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:54.132 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:54.132 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:54.132 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:54.132 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:54.132 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:54.132 10:17:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.392 10:17:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:54.392 "name": "raid_bdev1", 00:21:54.392 "uuid": "d6d00d38-679b-4cc2-8784-d176733032b9", 00:21:54.392 "strip_size_kb": 0, 00:21:54.392 "state": "online", 00:21:54.392 "raid_level": "raid1", 00:21:54.392 "superblock": false, 00:21:54.392 
"num_base_bdevs": 4, 00:21:54.392 "num_base_bdevs_discovered": 3, 00:21:54.392 "num_base_bdevs_operational": 3, 00:21:54.392 "base_bdevs_list": [ 00:21:54.392 { 00:21:54.392 "name": null, 00:21:54.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.392 "is_configured": false, 00:21:54.392 "data_offset": 0, 00:21:54.392 "data_size": 65536 00:21:54.392 }, 00:21:54.392 { 00:21:54.392 "name": "BaseBdev2", 00:21:54.392 "uuid": "53123294-5f18-548b-8f78-f2b5514d2dc7", 00:21:54.392 "is_configured": true, 00:21:54.392 "data_offset": 0, 00:21:54.392 "data_size": 65536 00:21:54.392 }, 00:21:54.392 { 00:21:54.392 "name": "BaseBdev3", 00:21:54.392 "uuid": "45ab97b3-3e6d-5ff6-8304-a507e3b273a7", 00:21:54.392 "is_configured": true, 00:21:54.392 "data_offset": 0, 00:21:54.392 "data_size": 65536 00:21:54.392 }, 00:21:54.392 { 00:21:54.392 "name": "BaseBdev4", 00:21:54.392 "uuid": "33182848-25d5-5f08-8f11-aa20ab0588a7", 00:21:54.392 "is_configured": true, 00:21:54.392 "data_offset": 0, 00:21:54.392 "data_size": 65536 00:21:54.392 } 00:21:54.392 ] 00:21:54.392 }' 00:21:54.392 10:17:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:54.392 10:17:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:54.392 10:17:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:54.392 10:17:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:54.392 10:17:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:54.652 [2024-06-10 10:17:16.399565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:54.652 [2024-06-10 10:17:16.402297] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2660e10 00:21:54.652 [2024-06-10 10:17:16.403461] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:54.652 10:17:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:55.591 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:55.591 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:55.592 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:55.592 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:55.592 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:55.592 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.592 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:55.852 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:55.852 "name": "raid_bdev1", 00:21:55.852 "uuid": "d6d00d38-679b-4cc2-8784-d176733032b9", 00:21:55.852 "strip_size_kb": 0, 00:21:55.852 "state": "online", 00:21:55.852 "raid_level": "raid1", 00:21:55.852 "superblock": false, 00:21:55.852 "num_base_bdevs": 4, 00:21:55.852 "num_base_bdevs_discovered": 4, 00:21:55.852 "num_base_bdevs_operational": 4, 00:21:55.852 "process": { 00:21:55.852 "type": 
"rebuild", 00:21:55.852 "target": "spare", 00:21:55.852 "progress": { 00:21:55.852 "blocks": 22528, 00:21:55.852 "percent": 34 00:21:55.852 } 00:21:55.852 }, 00:21:55.852 "base_bdevs_list": [ 00:21:55.852 { 00:21:55.852 "name": "spare", 00:21:55.852 "uuid": "9103341a-76b2-5361-a8c5-d5aee6a5baa0", 00:21:55.852 "is_configured": true, 00:21:55.852 "data_offset": 0, 00:21:55.852 "data_size": 65536 00:21:55.852 }, 00:21:55.852 { 00:21:55.852 "name": "BaseBdev2", 00:21:55.852 "uuid": "53123294-5f18-548b-8f78-f2b5514d2dc7", 00:21:55.852 "is_configured": true, 00:21:55.852 "data_offset": 0, 00:21:55.852 "data_size": 65536 00:21:55.852 }, 00:21:55.852 { 00:21:55.852 "name": "BaseBdev3", 00:21:55.852 "uuid": "45ab97b3-3e6d-5ff6-8304-a507e3b273a7", 00:21:55.852 "is_configured": true, 00:21:55.852 "data_offset": 0, 00:21:55.852 "data_size": 65536 00:21:55.852 }, 00:21:55.852 { 00:21:55.852 "name": "BaseBdev4", 00:21:55.852 "uuid": "33182848-25d5-5f08-8f11-aa20ab0588a7", 00:21:55.852 "is_configured": true, 00:21:55.852 "data_offset": 0, 00:21:55.852 "data_size": 65536 00:21:55.852 } 00:21:55.852 ] 00:21:55.852 }' 00:21:55.852 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:55.852 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:55.852 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:55.852 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:55.852 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:21:55.852 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:21:55.852 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:55.852 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:21:55.852 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:56.113 [2024-06-10 10:17:17.856224] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:56.113 [2024-06-10 10:17:17.912334] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2660e10 00:21:56.113 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:21:56.113 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:21:56.113 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:56.113 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:56.113 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:56.113 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:56.113 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:56.113 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.113 10:17:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:56.373 "name": "raid_bdev1", 00:21:56.373 "uuid": "d6d00d38-679b-4cc2-8784-d176733032b9", 00:21:56.373 "strip_size_kb": 0, 00:21:56.373 "state": "online", 00:21:56.373 "raid_level": "raid1", 00:21:56.373 "superblock": false, 00:21:56.373 "num_base_bdevs": 4, 00:21:56.373 "num_base_bdevs_discovered": 3, 00:21:56.373 "num_base_bdevs_operational": 3, 00:21:56.373 "process": { 00:21:56.373 "type": "rebuild", 00:21:56.373 "target": "spare", 00:21:56.373 "progress": { 00:21:56.373 "blocks": 34816, 00:21:56.373 "percent": 53 00:21:56.373 } 00:21:56.373 }, 00:21:56.373 "base_bdevs_list": [ 00:21:56.373 { 00:21:56.373 "name": "spare", 00:21:56.373 "uuid": "9103341a-76b2-5361-a8c5-d5aee6a5baa0", 00:21:56.373 "is_configured": true, 00:21:56.373 "data_offset": 0, 00:21:56.373 "data_size": 65536 00:21:56.373 }, 00:21:56.373 { 00:21:56.373 "name": null, 00:21:56.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.373 "is_configured": false, 00:21:56.373 "data_offset": 0, 00:21:56.373 "data_size": 65536 00:21:56.373 }, 00:21:56.373 { 00:21:56.373 "name": "BaseBdev3", 00:21:56.373 "uuid": "45ab97b3-3e6d-5ff6-8304-a507e3b273a7", 00:21:56.373 "is_configured": true, 00:21:56.373 "data_offset": 0, 00:21:56.373 "data_size": 65536 00:21:56.373 }, 00:21:56.373 { 00:21:56.373 "name": "BaseBdev4", 00:21:56.373 "uuid": "33182848-25d5-5f08-8f11-aa20ab0588a7", 00:21:56.373 "is_configured": true, 00:21:56.373 "data_offset": 0, 00:21:56.373 "data_size": 65536 00:21:56.373 } 00:21:56.373 ] 00:21:56.373 }' 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=730 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.373 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:56.633 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:56.633 "name": "raid_bdev1", 00:21:56.633 "uuid": "d6d00d38-679b-4cc2-8784-d176733032b9", 00:21:56.633 "strip_size_kb": 0, 00:21:56.633 "state": "online", 00:21:56.633 "raid_level": "raid1", 00:21:56.633 "superblock": false, 00:21:56.633 "num_base_bdevs": 4, 00:21:56.633 "num_base_bdevs_discovered": 3, 00:21:56.633 "num_base_bdevs_operational": 3, 00:21:56.633 "process": { 
00:21:56.633 "type": "rebuild", 00:21:56.633 "target": "spare", 00:21:56.633 "progress": { 00:21:56.633 "blocks": 38912, 00:21:56.633 "percent": 59 00:21:56.633 } 00:21:56.633 }, 00:21:56.633 "base_bdevs_list": [ 00:21:56.633 { 00:21:56.633 "name": "spare", 00:21:56.633 "uuid": "9103341a-76b2-5361-a8c5-d5aee6a5baa0", 00:21:56.633 "is_configured": true, 00:21:56.633 "data_offset": 0, 00:21:56.633 "data_size": 65536 00:21:56.633 }, 00:21:56.633 { 00:21:56.633 "name": null, 00:21:56.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.633 "is_configured": false, 00:21:56.633 "data_offset": 0, 00:21:56.633 "data_size": 65536 00:21:56.633 }, 00:21:56.633 { 00:21:56.633 "name": "BaseBdev3", 00:21:56.633 "uuid": "45ab97b3-3e6d-5ff6-8304-a507e3b273a7", 00:21:56.633 "is_configured": true, 00:21:56.633 "data_offset": 0, 00:21:56.633 "data_size": 65536 00:21:56.633 }, 00:21:56.633 { 00:21:56.633 "name": "BaseBdev4", 00:21:56.633 "uuid": "33182848-25d5-5f08-8f11-aa20ab0588a7", 00:21:56.633 "is_configured": true, 00:21:56.633 "data_offset": 0, 00:21:56.633 "data_size": 65536 00:21:56.633 } 00:21:56.633 ] 00:21:56.633 }' 00:21:56.634 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:56.634 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:56.634 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:56.894 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:56.894 10:17:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:57.836 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:57.836 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:57.836 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:57.836 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:57.836 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:57.836 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:57.836 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:57.836 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.836 [2024-06-10 10:17:19.622267] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:57.836 [2024-06-10 10:17:19.622309] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:57.836 [2024-06-10 10:17:19.622334] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:58.096 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:58.096 "name": "raid_bdev1", 00:21:58.096 "uuid": "d6d00d38-679b-4cc2-8784-d176733032b9", 00:21:58.096 "strip_size_kb": 0, 00:21:58.096 "state": "online", 00:21:58.096 "raid_level": "raid1", 00:21:58.096 "superblock": false, 00:21:58.096 "num_base_bdevs": 4, 00:21:58.096 "num_base_bdevs_discovered": 3, 00:21:58.096 "num_base_bdevs_operational": 3, 00:21:58.096 "base_bdevs_list": [ 00:21:58.096 { 00:21:58.096 "name": "spare", 
00:21:58.096 "uuid": "9103341a-76b2-5361-a8c5-d5aee6a5baa0", 00:21:58.096 "is_configured": true, 00:21:58.096 "data_offset": 0, 00:21:58.096 "data_size": 65536 00:21:58.096 }, 00:21:58.096 { 00:21:58.096 "name": null, 00:21:58.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.096 "is_configured": false, 00:21:58.096 "data_offset": 0, 00:21:58.096 "data_size": 65536 00:21:58.096 }, 00:21:58.096 { 00:21:58.096 "name": "BaseBdev3", 00:21:58.096 "uuid": "45ab97b3-3e6d-5ff6-8304-a507e3b273a7", 00:21:58.096 "is_configured": true, 00:21:58.096 "data_offset": 0, 00:21:58.096 "data_size": 65536 00:21:58.096 }, 00:21:58.096 { 00:21:58.096 "name": "BaseBdev4", 00:21:58.096 "uuid": "33182848-25d5-5f08-8f11-aa20ab0588a7", 00:21:58.096 "is_configured": true, 00:21:58.096 "data_offset": 0, 00:21:58.096 "data_size": 65536 00:21:58.096 } 00:21:58.096 ] 00:21:58.096 }' 00:21:58.096 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:58.097 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:58.097 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:58.097 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:58.097 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:21:58.097 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:58.097 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:58.097 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:58.097 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:58.097 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:58.097 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.097 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.357 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:58.357 "name": "raid_bdev1", 00:21:58.357 "uuid": "d6d00d38-679b-4cc2-8784-d176733032b9", 00:21:58.357 "strip_size_kb": 0, 00:21:58.357 "state": "online", 00:21:58.357 "raid_level": "raid1", 00:21:58.357 "superblock": false, 00:21:58.357 "num_base_bdevs": 4, 00:21:58.357 "num_base_bdevs_discovered": 3, 00:21:58.357 "num_base_bdevs_operational": 3, 00:21:58.357 "base_bdevs_list": [ 00:21:58.357 { 00:21:58.357 "name": "spare", 00:21:58.357 "uuid": "9103341a-76b2-5361-a8c5-d5aee6a5baa0", 00:21:58.357 "is_configured": true, 00:21:58.357 "data_offset": 0, 00:21:58.357 "data_size": 65536 00:21:58.357 }, 00:21:58.357 { 00:21:58.357 "name": null, 00:21:58.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.357 "is_configured": false, 00:21:58.357 "data_offset": 0, 00:21:58.357 "data_size": 65536 00:21:58.357 }, 00:21:58.357 { 00:21:58.357 "name": "BaseBdev3", 00:21:58.357 "uuid": "45ab97b3-3e6d-5ff6-8304-a507e3b273a7", 00:21:58.357 "is_configured": true, 00:21:58.357 "data_offset": 0, 00:21:58.357 "data_size": 65536 00:21:58.357 }, 00:21:58.357 { 00:21:58.357 "name": "BaseBdev4", 00:21:58.357 "uuid": "33182848-25d5-5f08-8f11-aa20ab0588a7", 00:21:58.357 "is_configured": 
true, 00:21:58.357 "data_offset": 0, 00:21:58.357 "data_size": 65536 00:21:58.357 } 00:21:58.357 ] 00:21:58.357 }' 00:21:58.357 10:17:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.357 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.617 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.617 "name": "raid_bdev1", 00:21:58.617 "uuid": "d6d00d38-679b-4cc2-8784-d176733032b9", 00:21:58.617 "strip_size_kb": 0, 00:21:58.617 "state": "online", 00:21:58.617 "raid_level": "raid1", 00:21:58.617 "superblock": false, 00:21:58.617 "num_base_bdevs": 4, 00:21:58.617 "num_base_bdevs_discovered": 3, 00:21:58.617 "num_base_bdevs_operational": 3, 00:21:58.617 "base_bdevs_list": [ 00:21:58.617 { 00:21:58.617 "name": "spare", 00:21:58.617 "uuid": "9103341a-76b2-5361-a8c5-d5aee6a5baa0", 00:21:58.617 "is_configured": true, 00:21:58.617 "data_offset": 0, 00:21:58.617 "data_size": 65536 00:21:58.617 }, 00:21:58.617 { 00:21:58.617 "name": null, 00:21:58.617 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.617 "is_configured": false, 00:21:58.617 "data_offset": 0, 00:21:58.617 "data_size": 65536 00:21:58.617 }, 00:21:58.617 { 00:21:58.617 "name": "BaseBdev3", 00:21:58.617 "uuid": "45ab97b3-3e6d-5ff6-8304-a507e3b273a7", 00:21:58.617 "is_configured": true, 00:21:58.617 "data_offset": 0, 00:21:58.618 "data_size": 65536 00:21:58.618 }, 00:21:58.618 { 00:21:58.618 "name": "BaseBdev4", 00:21:58.618 "uuid": "33182848-25d5-5f08-8f11-aa20ab0588a7", 00:21:58.618 "is_configured": true, 00:21:58.618 "data_offset": 0, 00:21:58.618 "data_size": 65536 00:21:58.618 } 00:21:58.618 ] 00:21:58.618 }' 00:21:58.618 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.618 10:17:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:58.877 
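The rebuild exercise that finishes above repeatedly applies one pattern: re-attach a member (the spare), confirm a rebuild process targeting it appears, then query bdev_raid_get_bdevs until the process reports none and the array is back to 3 operational members. A condensed sketch of that cycle using only the RPCs and jq filters seen in the trace; the polling loop is illustrative glue, not the verify_raid_bdev_process implementation itself:

# re-attach the spare; the raid module starts a rebuild targeting it
$rpc bdev_raid_add_base_bdev raid_bdev1 spare
# poll until the rebuild process no longer shows up in the raid bdev info
while true; do
  info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  [ "$(echo "$info" | jq -r '.process.type // "none"')" = "none" ] && break
  sleep 1
done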
10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:59.137 [2024-06-10 10:17:20.860799] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:59.137 [2024-06-10 10:17:20.860819] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:59.137 [2024-06-10 10:17:20.860869] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:59.137 [2024-06-10 10:17:20.860922] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:59.137 [2024-06-10 10:17:20.860928] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25e1e20 name raid_bdev1, state offline 00:21:59.137 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.137 10:17:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:59.397 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:59.658 /dev/nbd0 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 
00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:59.658 1+0 records in 00:21:59.658 1+0 records out 00:21:59.658 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274918 s, 14.9 MB/s 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:59.658 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:59.658 /dev/nbd1 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:59.918 1+0 records in 00:21:59.918 1+0 records out 00:21:59.918 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275153 s, 14.9 MB/s 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:59.918 
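With raid_bdev1 deleted, the test checks the outcome of the rebuild directly on the members: BaseBdev1 (removed before the rebuild, so it still holds the originally written data) and spare (rebuilt from the surviving members) are both exported over NBD, and the cmp in the trace that follows expects them to be byte-for-byte identical. Condensed (same rpc shorthand as before):

$rpc nbd_start_disk BaseBdev1 /dev/nbd0
$rpc nbd_start_disk spare /dev/nbd1
# a successful rebuild means the spare carries the same data as the removed member
cmp -i 0 /dev/nbd0 /dev/nbd1
$rpc nbd_stop_disk /dev/nbd0
$rpc nbd_stop_disk /dev/nbd1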
10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:59.918 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:00.179 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:00.179 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:00.179 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:00.179 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:00.179 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:00.179 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:00.179 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:00.179 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:00.179 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:00.179 10:17:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1086432 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@949 -- # '[' -z 1086432 ']' 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # kill -0 1086432 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # uname 00:22:00.179 10:17:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:00.179 10:17:22 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1086432 00:22:00.457 10:17:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:00.457 10:17:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:00.457 10:17:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1086432' 00:22:00.457 killing process with pid 1086432 00:22:00.457 10:17:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # kill 1086432 00:22:00.457 Received shutdown signal, test time was about 60.000000 seconds 00:22:00.458 00:22:00.458 Latency(us) 00:22:00.458 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:00.458 =================================================================================================================== 00:22:00.458 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:00.458 [2024-06-10 10:17:22.086517] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:00.458 10:17:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@973 -- # wait 1086432 00:22:00.458 [2024-06-10 10:17:22.112482] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:00.458 10:17:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:22:00.458 00:22:00.458 real 0m22.911s 00:22:00.458 user 0m29.782s 00:22:00.458 sys 0m4.191s 00:22:00.458 10:17:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:00.458 10:17:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:00.458 ************************************ 00:22:00.458 END TEST raid_rebuild_test 00:22:00.458 ************************************ 00:22:00.458 10:17:22 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:22:00.458 10:17:22 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:22:00.458 10:17:22 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:00.458 10:17:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:00.458 ************************************ 00:22:00.458 START TEST raid_rebuild_test_sb 00:22:00.458 ************************************ 00:22:00.458 10:17:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 true false true 00:22:00.458 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:00.458 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:22:00.458 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:00.459 10:17:22 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:22:00.459 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1090356 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1090356 /var/tmp/spdk-raid.sock 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1090356 ']' 00:22:00.460 10:17:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:00.461 10:17:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:00.461 10:17:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:00.461 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:00.461 10:17:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:00.461 10:17:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:00.461 10:17:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:00.724 [2024-06-10 10:17:22.364412] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
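The trace from this point on builds the bdev stack that raid_rebuild_test_sb exercises. As a condensed sketch only (assuming an SPDK application such as the bdevperf instance above is already listening on /var/tmp/spdk-raid.sock; sizes and names mirror the trace, not the test script itself):

# Minimal sketch of the bdev stack the trace builds: 32 MiB malloc bdevs with
# 512-byte blocks, each wrapped in a passthru bdev, plus a delayed spare.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Four base bdevs: malloc backing device + passthru wrapper each
for i in 1 2 3 4; do
    $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
    $RPC bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
done

# Spare device: malloc + delay + passthru (delay parameters as in the trace)
$RPC bdev_malloc_create 32 512 -b spare_malloc
$RPC bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
$RPC bdev_passthru_create -b spare_delay -p spare

# RAID1 bdev with an on-disk superblock (-s) over the four base bdevs
$RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1

The delay bdev in front of the spare (100 ms write latency in the trace) is presumably what keeps the later rebuild slow enough for the test to observe and interrupt it.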
00:22:00.724 [2024-06-10 10:17:22.364459] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1090356 ] 00:22:00.724 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:00.724 Zero copy mechanism will not be used. 00:22:00.724 [2024-06-10 10:17:22.450402] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:00.724 [2024-06-10 10:17:22.512907] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:00.724 [2024-06-10 10:17:22.556152] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:00.724 [2024-06-10 10:17:22.556176] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:01.664 10:17:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:01.664 10:17:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@863 -- # return 0 00:22:01.664 10:17:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:01.664 10:17:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:01.664 BaseBdev1_malloc 00:22:01.664 10:17:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:01.924 [2024-06-10 10:17:23.546136] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:01.924 [2024-06-10 10:17:23.546170] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:01.924 [2024-06-10 10:17:23.546186] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc38a50 00:22:01.924 [2024-06-10 10:17:23.546192] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:01.924 [2024-06-10 10:17:23.547485] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:01.924 [2024-06-10 10:17:23.547507] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:01.924 BaseBdev1 00:22:01.924 10:17:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:01.924 10:17:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:01.924 BaseBdev2_malloc 00:22:01.924 10:17:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:02.184 [2024-06-10 10:17:23.913060] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:02.184 [2024-06-10 10:17:23.913091] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:02.184 [2024-06-10 10:17:23.913105] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc395a0 00:22:02.184 [2024-06-10 10:17:23.913111] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:02.184 [2024-06-10 10:17:23.914276] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:02.184 [2024-06-10 10:17:23.914294] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:02.184 BaseBdev2 00:22:02.184 10:17:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:02.184 10:17:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:02.444 BaseBdev3_malloc 00:22:02.444 10:17:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:02.444 [2024-06-10 10:17:24.291933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:02.444 [2024-06-10 10:17:24.291959] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:02.444 [2024-06-10 10:17:24.291971] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde5a30 00:22:02.444 [2024-06-10 10:17:24.291978] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:02.444 [2024-06-10 10:17:24.293159] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:02.444 [2024-06-10 10:17:24.293176] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:02.444 BaseBdev3 00:22:02.444 10:17:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:02.444 10:17:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:02.704 BaseBdev4_malloc 00:22:02.704 10:17:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:02.965 [2024-06-10 10:17:24.658672] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:02.965 [2024-06-10 10:17:24.658700] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:02.965 [2024-06-10 10:17:24.658712] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde42c0 00:22:02.965 [2024-06-10 10:17:24.658718] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:02.965 [2024-06-10 10:17:24.659887] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:02.965 [2024-06-10 10:17:24.659905] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:02.965 BaseBdev4 00:22:02.965 10:17:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:03.225 spare_malloc 00:22:03.225 10:17:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:03.225 spare_delay 00:22:03.225 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:03.485 [2024-06-10 10:17:25.225986] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:03.485 [2024-06-10 10:17:25.226013] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:03.485 [2024-06-10 10:17:25.226025] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdea2d0 00:22:03.485 [2024-06-10 10:17:25.226031] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:03.485 [2024-06-10 10:17:25.227228] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:03.485 [2024-06-10 10:17:25.227246] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:03.485 spare 00:22:03.485 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:03.745 [2024-06-10 10:17:25.402457] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:03.745 [2024-06-10 10:17:25.403442] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:03.745 [2024-06-10 10:17:25.403484] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:03.745 [2024-06-10 10:17:25.403517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:03.745 [2024-06-10 10:17:25.403661] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd69e20 00:22:03.745 [2024-06-10 10:17:25.403668] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:03.745 [2024-06-10 10:17:25.403817] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd69dc0 00:22:03.745 [2024-06-10 10:17:25.403941] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd69e20 00:22:03.745 [2024-06-10 10:17:25.403947] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd69e20 00:22:03.745 [2024-06-10 10:17:25.404013] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:03.745 "name": "raid_bdev1", 00:22:03.745 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:03.745 "strip_size_kb": 0, 00:22:03.745 "state": "online", 00:22:03.745 "raid_level": "raid1", 00:22:03.745 "superblock": true, 00:22:03.745 "num_base_bdevs": 4, 00:22:03.745 "num_base_bdevs_discovered": 4, 00:22:03.745 "num_base_bdevs_operational": 4, 00:22:03.745 "base_bdevs_list": [ 00:22:03.745 { 00:22:03.745 "name": "BaseBdev1", 00:22:03.745 "uuid": "8698eff1-0479-562e-9796-5d07ef7f7186", 00:22:03.745 "is_configured": true, 00:22:03.745 "data_offset": 2048, 00:22:03.745 "data_size": 63488 00:22:03.745 }, 00:22:03.745 { 00:22:03.745 "name": "BaseBdev2", 00:22:03.745 "uuid": "e163dcd5-1510-5046-91ee-1a29669d8a32", 00:22:03.745 "is_configured": true, 00:22:03.745 "data_offset": 2048, 00:22:03.745 "data_size": 63488 00:22:03.745 }, 00:22:03.745 { 00:22:03.745 "name": "BaseBdev3", 00:22:03.745 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:03.745 "is_configured": true, 00:22:03.745 "data_offset": 2048, 00:22:03.745 "data_size": 63488 00:22:03.745 }, 00:22:03.745 { 00:22:03.745 "name": "BaseBdev4", 00:22:03.745 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:03.745 "is_configured": true, 00:22:03.745 "data_offset": 2048, 00:22:03.745 "data_size": 63488 00:22:03.745 } 00:22:03.745 ] 00:22:03.745 }' 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:03.745 10:17:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:04.318 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:04.318 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:04.579 [2024-06-10 10:17:26.337001] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:04.579 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:22:04.579 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.579 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:04.840 
10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:04.840 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:05.101 [2024-06-10 10:17:26.725793] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc38160 00:22:05.101 /dev/nbd0 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:05.101 1+0 records in 00:22:05.101 1+0 records out 00:22:05.101 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272987 s, 15.0 MB/s 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:05.101 10:17:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:22:13.250 63488+0 records in 00:22:13.250 63488+0 records 
out 00:22:13.250 32505856 bytes (33 MB, 31 MiB) copied, 8.2047 s, 4.0 MB/s 00:22:13.250 10:17:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:13.250 10:17:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:13.250 10:17:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:13.250 10:17:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:13.250 10:17:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:13.250 10:17:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:13.250 10:17:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:13.511 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:13.511 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:13.511 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:13.511 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:13.511 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:13.511 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:13.511 [2024-06-10 10:17:35.183511] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:13.511 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:13.511 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:13.511 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:13.511 [2024-06-10 10:17:35.361055] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:13.511 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:13.512 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:13.512 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:13.512 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:13.512 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:13.512 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:13.512 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:13.512 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:13.512 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:13.512 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:13.773 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.773 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:22:13.773 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:13.773 "name": "raid_bdev1", 00:22:13.773 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:13.773 "strip_size_kb": 0, 00:22:13.773 "state": "online", 00:22:13.773 "raid_level": "raid1", 00:22:13.773 "superblock": true, 00:22:13.773 "num_base_bdevs": 4, 00:22:13.773 "num_base_bdevs_discovered": 3, 00:22:13.773 "num_base_bdevs_operational": 3, 00:22:13.773 "base_bdevs_list": [ 00:22:13.773 { 00:22:13.773 "name": null, 00:22:13.773 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:13.773 "is_configured": false, 00:22:13.773 "data_offset": 2048, 00:22:13.773 "data_size": 63488 00:22:13.773 }, 00:22:13.773 { 00:22:13.773 "name": "BaseBdev2", 00:22:13.773 "uuid": "e163dcd5-1510-5046-91ee-1a29669d8a32", 00:22:13.773 "is_configured": true, 00:22:13.773 "data_offset": 2048, 00:22:13.773 "data_size": 63488 00:22:13.773 }, 00:22:13.773 { 00:22:13.773 "name": "BaseBdev3", 00:22:13.773 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:13.773 "is_configured": true, 00:22:13.773 "data_offset": 2048, 00:22:13.773 "data_size": 63488 00:22:13.773 }, 00:22:13.773 { 00:22:13.773 "name": "BaseBdev4", 00:22:13.773 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:13.773 "is_configured": true, 00:22:13.773 "data_offset": 2048, 00:22:13.773 "data_size": 63488 00:22:13.773 } 00:22:13.773 ] 00:22:13.773 }' 00:22:13.773 10:17:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:13.773 10:17:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:14.383 10:17:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:14.650 [2024-06-10 10:17:36.275384] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:14.650 [2024-06-10 10:17:36.278246] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc381e0 00:22:14.650 [2024-06-10 10:17:36.279891] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:14.650 10:17:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:15.591 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:15.591 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:15.591 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:15.591 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:15.591 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:15.591 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.591 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.853 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:15.853 "name": "raid_bdev1", 00:22:15.853 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:15.853 "strip_size_kb": 0, 00:22:15.853 "state": "online", 00:22:15.853 "raid_level": "raid1", 00:22:15.853 "superblock": 
true, 00:22:15.853 "num_base_bdevs": 4, 00:22:15.853 "num_base_bdevs_discovered": 4, 00:22:15.853 "num_base_bdevs_operational": 4, 00:22:15.853 "process": { 00:22:15.853 "type": "rebuild", 00:22:15.853 "target": "spare", 00:22:15.853 "progress": { 00:22:15.853 "blocks": 22528, 00:22:15.853 "percent": 35 00:22:15.853 } 00:22:15.853 }, 00:22:15.853 "base_bdevs_list": [ 00:22:15.853 { 00:22:15.853 "name": "spare", 00:22:15.853 "uuid": "6e140a57-1eaa-5cb3-b0bb-bbe552c7a54c", 00:22:15.853 "is_configured": true, 00:22:15.853 "data_offset": 2048, 00:22:15.853 "data_size": 63488 00:22:15.853 }, 00:22:15.853 { 00:22:15.853 "name": "BaseBdev2", 00:22:15.853 "uuid": "e163dcd5-1510-5046-91ee-1a29669d8a32", 00:22:15.853 "is_configured": true, 00:22:15.853 "data_offset": 2048, 00:22:15.853 "data_size": 63488 00:22:15.853 }, 00:22:15.853 { 00:22:15.853 "name": "BaseBdev3", 00:22:15.853 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:15.853 "is_configured": true, 00:22:15.853 "data_offset": 2048, 00:22:15.853 "data_size": 63488 00:22:15.853 }, 00:22:15.853 { 00:22:15.853 "name": "BaseBdev4", 00:22:15.853 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:15.853 "is_configured": true, 00:22:15.853 "data_offset": 2048, 00:22:15.853 "data_size": 63488 00:22:15.853 } 00:22:15.853 ] 00:22:15.853 }' 00:22:15.853 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:15.853 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:15.853 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:15.853 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:15.853 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:16.114 [2024-06-10 10:17:37.752617] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:16.114 [2024-06-10 10:17:37.788696] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:16.114 [2024-06-10 10:17:37.788725] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:16.114 [2024-06-10 10:17:37.788737] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:16.114 [2024-06-10 10:17:37.788741] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:16.114 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:16.114 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:16.114 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:16.114 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:16.114 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:16.114 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:16.114 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:16.114 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:16.114 10:17:37 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:16.114 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:16.114 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.115 10:17:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:16.375 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:16.375 "name": "raid_bdev1", 00:22:16.375 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:16.375 "strip_size_kb": 0, 00:22:16.375 "state": "online", 00:22:16.375 "raid_level": "raid1", 00:22:16.375 "superblock": true, 00:22:16.375 "num_base_bdevs": 4, 00:22:16.375 "num_base_bdevs_discovered": 3, 00:22:16.375 "num_base_bdevs_operational": 3, 00:22:16.375 "base_bdevs_list": [ 00:22:16.375 { 00:22:16.375 "name": null, 00:22:16.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:16.375 "is_configured": false, 00:22:16.375 "data_offset": 2048, 00:22:16.375 "data_size": 63488 00:22:16.375 }, 00:22:16.375 { 00:22:16.375 "name": "BaseBdev2", 00:22:16.375 "uuid": "e163dcd5-1510-5046-91ee-1a29669d8a32", 00:22:16.375 "is_configured": true, 00:22:16.375 "data_offset": 2048, 00:22:16.375 "data_size": 63488 00:22:16.375 }, 00:22:16.375 { 00:22:16.375 "name": "BaseBdev3", 00:22:16.375 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:16.375 "is_configured": true, 00:22:16.375 "data_offset": 2048, 00:22:16.375 "data_size": 63488 00:22:16.375 }, 00:22:16.375 { 00:22:16.375 "name": "BaseBdev4", 00:22:16.375 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:16.375 "is_configured": true, 00:22:16.375 "data_offset": 2048, 00:22:16.375 "data_size": 63488 00:22:16.375 } 00:22:16.375 ] 00:22:16.375 }' 00:22:16.375 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:16.375 10:17:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:16.947 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:16.947 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:16.947 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:16.947 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:16.947 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:16.947 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:16.947 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.947 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:16.947 "name": "raid_bdev1", 00:22:16.947 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:16.947 "strip_size_kb": 0, 00:22:16.947 "state": "online", 00:22:16.947 "raid_level": "raid1", 00:22:16.947 "superblock": true, 00:22:16.947 "num_base_bdevs": 4, 00:22:16.947 "num_base_bdevs_discovered": 3, 00:22:16.947 "num_base_bdevs_operational": 3, 00:22:16.947 "base_bdevs_list": [ 00:22:16.947 { 00:22:16.947 "name": null, 00:22:16.947 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:22:16.947 "is_configured": false, 00:22:16.947 "data_offset": 2048, 00:22:16.947 "data_size": 63488 00:22:16.947 }, 00:22:16.947 { 00:22:16.947 "name": "BaseBdev2", 00:22:16.947 "uuid": "e163dcd5-1510-5046-91ee-1a29669d8a32", 00:22:16.947 "is_configured": true, 00:22:16.947 "data_offset": 2048, 00:22:16.947 "data_size": 63488 00:22:16.947 }, 00:22:16.947 { 00:22:16.947 "name": "BaseBdev3", 00:22:16.947 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:16.947 "is_configured": true, 00:22:16.947 "data_offset": 2048, 00:22:16.947 "data_size": 63488 00:22:16.947 }, 00:22:16.947 { 00:22:16.947 "name": "BaseBdev4", 00:22:16.947 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:16.947 "is_configured": true, 00:22:16.947 "data_offset": 2048, 00:22:16.947 "data_size": 63488 00:22:16.947 } 00:22:16.947 ] 00:22:16.947 }' 00:22:16.947 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:16.947 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:16.947 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:17.208 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:17.208 10:17:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:17.208 [2024-06-10 10:17:38.995780] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:17.208 [2024-06-10 10:17:38.998498] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc381f0 00:22:17.208 [2024-06-10 10:17:38.999664] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:17.208 10:17:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:18.151 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:18.151 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:18.151 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:18.151 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:18.151 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:18.151 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.151 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.411 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:18.411 "name": "raid_bdev1", 00:22:18.411 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:18.411 "strip_size_kb": 0, 00:22:18.411 "state": "online", 00:22:18.411 "raid_level": "raid1", 00:22:18.411 "superblock": true, 00:22:18.411 "num_base_bdevs": 4, 00:22:18.411 "num_base_bdevs_discovered": 4, 00:22:18.411 "num_base_bdevs_operational": 4, 00:22:18.411 "process": { 00:22:18.411 "type": "rebuild", 00:22:18.411 "target": "spare", 00:22:18.411 "progress": { 00:22:18.411 "blocks": 22528, 00:22:18.411 "percent": 35 00:22:18.411 } 00:22:18.411 }, 
00:22:18.411 "base_bdevs_list": [ 00:22:18.411 { 00:22:18.411 "name": "spare", 00:22:18.411 "uuid": "6e140a57-1eaa-5cb3-b0bb-bbe552c7a54c", 00:22:18.411 "is_configured": true, 00:22:18.411 "data_offset": 2048, 00:22:18.411 "data_size": 63488 00:22:18.411 }, 00:22:18.411 { 00:22:18.411 "name": "BaseBdev2", 00:22:18.411 "uuid": "e163dcd5-1510-5046-91ee-1a29669d8a32", 00:22:18.411 "is_configured": true, 00:22:18.411 "data_offset": 2048, 00:22:18.411 "data_size": 63488 00:22:18.411 }, 00:22:18.411 { 00:22:18.411 "name": "BaseBdev3", 00:22:18.411 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:18.411 "is_configured": true, 00:22:18.411 "data_offset": 2048, 00:22:18.411 "data_size": 63488 00:22:18.411 }, 00:22:18.411 { 00:22:18.411 "name": "BaseBdev4", 00:22:18.411 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:18.411 "is_configured": true, 00:22:18.411 "data_offset": 2048, 00:22:18.411 "data_size": 63488 00:22:18.411 } 00:22:18.411 ] 00:22:18.411 }' 00:22:18.411 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:18.411 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:18.411 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:18.670 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:18.670 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:18.670 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:18.670 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:18.670 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:22:18.670 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:18.670 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:22:18.671 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:18.671 [2024-06-10 10:17:40.464400] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:18.930 [2024-06-10 10:17:40.608756] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xc381f0 00:22:18.930 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:22:18.930 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:22:18.930 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:18.930 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:18.930 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:18.930 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:18.930 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:18.930 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.930 10:17:40 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:19.190 "name": "raid_bdev1", 00:22:19.190 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:19.190 "strip_size_kb": 0, 00:22:19.190 "state": "online", 00:22:19.190 "raid_level": "raid1", 00:22:19.190 "superblock": true, 00:22:19.190 "num_base_bdevs": 4, 00:22:19.190 "num_base_bdevs_discovered": 3, 00:22:19.190 "num_base_bdevs_operational": 3, 00:22:19.190 "process": { 00:22:19.190 "type": "rebuild", 00:22:19.190 "target": "spare", 00:22:19.190 "progress": { 00:22:19.190 "blocks": 34816, 00:22:19.190 "percent": 54 00:22:19.190 } 00:22:19.190 }, 00:22:19.190 "base_bdevs_list": [ 00:22:19.190 { 00:22:19.190 "name": "spare", 00:22:19.190 "uuid": "6e140a57-1eaa-5cb3-b0bb-bbe552c7a54c", 00:22:19.190 "is_configured": true, 00:22:19.190 "data_offset": 2048, 00:22:19.190 "data_size": 63488 00:22:19.190 }, 00:22:19.190 { 00:22:19.190 "name": null, 00:22:19.190 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.190 "is_configured": false, 00:22:19.190 "data_offset": 2048, 00:22:19.190 "data_size": 63488 00:22:19.190 }, 00:22:19.190 { 00:22:19.190 "name": "BaseBdev3", 00:22:19.190 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:19.190 "is_configured": true, 00:22:19.190 "data_offset": 2048, 00:22:19.190 "data_size": 63488 00:22:19.190 }, 00:22:19.190 { 00:22:19.190 "name": "BaseBdev4", 00:22:19.190 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:19.190 "is_configured": true, 00:22:19.190 "data_offset": 2048, 00:22:19.190 "data_size": 63488 00:22:19.190 } 00:22:19.190 ] 00:22:19.190 }' 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=752 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.190 10:17:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:19.451 10:17:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:19.451 "name": "raid_bdev1", 00:22:19.451 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:19.451 "strip_size_kb": 0, 00:22:19.451 "state": "online", 00:22:19.451 "raid_level": "raid1", 
00:22:19.451 "superblock": true, 00:22:19.451 "num_base_bdevs": 4, 00:22:19.451 "num_base_bdevs_discovered": 3, 00:22:19.451 "num_base_bdevs_operational": 3, 00:22:19.451 "process": { 00:22:19.451 "type": "rebuild", 00:22:19.451 "target": "spare", 00:22:19.451 "progress": { 00:22:19.451 "blocks": 38912, 00:22:19.451 "percent": 61 00:22:19.451 } 00:22:19.451 }, 00:22:19.451 "base_bdevs_list": [ 00:22:19.451 { 00:22:19.451 "name": "spare", 00:22:19.451 "uuid": "6e140a57-1eaa-5cb3-b0bb-bbe552c7a54c", 00:22:19.451 "is_configured": true, 00:22:19.451 "data_offset": 2048, 00:22:19.451 "data_size": 63488 00:22:19.451 }, 00:22:19.451 { 00:22:19.451 "name": null, 00:22:19.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.451 "is_configured": false, 00:22:19.451 "data_offset": 2048, 00:22:19.451 "data_size": 63488 00:22:19.451 }, 00:22:19.451 { 00:22:19.451 "name": "BaseBdev3", 00:22:19.451 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:19.451 "is_configured": true, 00:22:19.451 "data_offset": 2048, 00:22:19.451 "data_size": 63488 00:22:19.451 }, 00:22:19.451 { 00:22:19.451 "name": "BaseBdev4", 00:22:19.451 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:19.451 "is_configured": true, 00:22:19.451 "data_offset": 2048, 00:22:19.451 "data_size": 63488 00:22:19.451 } 00:22:19.451 ] 00:22:19.451 }' 00:22:19.451 10:17:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:19.451 10:17:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:19.451 10:17:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:19.451 10:17:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:19.451 10:17:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:20.405 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:20.405 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:20.405 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:20.405 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:20.405 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:20.405 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:20.405 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.405 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.405 [2024-06-10 10:17:42.218144] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:20.405 [2024-06-10 10:17:42.218191] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:20.405 [2024-06-10 10:17:42.218267] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:20.670 "name": "raid_bdev1", 00:22:20.670 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:20.670 "strip_size_kb": 0, 00:22:20.670 "state": "online", 00:22:20.670 "raid_level": "raid1", 
00:22:20.670 "superblock": true, 00:22:20.670 "num_base_bdevs": 4, 00:22:20.670 "num_base_bdevs_discovered": 3, 00:22:20.670 "num_base_bdevs_operational": 3, 00:22:20.670 "base_bdevs_list": [ 00:22:20.670 { 00:22:20.670 "name": "spare", 00:22:20.670 "uuid": "6e140a57-1eaa-5cb3-b0bb-bbe552c7a54c", 00:22:20.670 "is_configured": true, 00:22:20.670 "data_offset": 2048, 00:22:20.670 "data_size": 63488 00:22:20.670 }, 00:22:20.670 { 00:22:20.670 "name": null, 00:22:20.670 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.670 "is_configured": false, 00:22:20.670 "data_offset": 2048, 00:22:20.670 "data_size": 63488 00:22:20.670 }, 00:22:20.670 { 00:22:20.670 "name": "BaseBdev3", 00:22:20.670 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:20.670 "is_configured": true, 00:22:20.670 "data_offset": 2048, 00:22:20.670 "data_size": 63488 00:22:20.670 }, 00:22:20.670 { 00:22:20.670 "name": "BaseBdev4", 00:22:20.670 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:20.670 "is_configured": true, 00:22:20.670 "data_offset": 2048, 00:22:20.670 "data_size": 63488 00:22:20.670 } 00:22:20.670 ] 00:22:20.670 }' 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.670 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:20.931 "name": "raid_bdev1", 00:22:20.931 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:20.931 "strip_size_kb": 0, 00:22:20.931 "state": "online", 00:22:20.931 "raid_level": "raid1", 00:22:20.931 "superblock": true, 00:22:20.931 "num_base_bdevs": 4, 00:22:20.931 "num_base_bdevs_discovered": 3, 00:22:20.931 "num_base_bdevs_operational": 3, 00:22:20.931 "base_bdevs_list": [ 00:22:20.931 { 00:22:20.931 "name": "spare", 00:22:20.931 "uuid": "6e140a57-1eaa-5cb3-b0bb-bbe552c7a54c", 00:22:20.931 "is_configured": true, 00:22:20.931 "data_offset": 2048, 00:22:20.931 "data_size": 63488 00:22:20.931 }, 00:22:20.931 { 00:22:20.931 "name": null, 00:22:20.931 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.931 "is_configured": false, 00:22:20.931 "data_offset": 2048, 00:22:20.931 "data_size": 63488 00:22:20.931 }, 00:22:20.931 { 00:22:20.931 "name": "BaseBdev3", 00:22:20.931 "uuid": 
"e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:20.931 "is_configured": true, 00:22:20.931 "data_offset": 2048, 00:22:20.931 "data_size": 63488 00:22:20.931 }, 00:22:20.931 { 00:22:20.931 "name": "BaseBdev4", 00:22:20.931 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:20.931 "is_configured": true, 00:22:20.931 "data_offset": 2048, 00:22:20.931 "data_size": 63488 00:22:20.931 } 00:22:20.931 ] 00:22:20.931 }' 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.931 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.192 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.192 "name": "raid_bdev1", 00:22:21.192 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:21.192 "strip_size_kb": 0, 00:22:21.192 "state": "online", 00:22:21.192 "raid_level": "raid1", 00:22:21.192 "superblock": true, 00:22:21.192 "num_base_bdevs": 4, 00:22:21.192 "num_base_bdevs_discovered": 3, 00:22:21.192 "num_base_bdevs_operational": 3, 00:22:21.192 "base_bdevs_list": [ 00:22:21.192 { 00:22:21.192 "name": "spare", 00:22:21.192 "uuid": "6e140a57-1eaa-5cb3-b0bb-bbe552c7a54c", 00:22:21.192 "is_configured": true, 00:22:21.192 "data_offset": 2048, 00:22:21.192 "data_size": 63488 00:22:21.192 }, 00:22:21.192 { 00:22:21.192 "name": null, 00:22:21.192 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.192 "is_configured": false, 00:22:21.192 "data_offset": 2048, 00:22:21.192 "data_size": 63488 00:22:21.192 }, 00:22:21.192 { 00:22:21.192 "name": "BaseBdev3", 00:22:21.192 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:21.192 "is_configured": true, 00:22:21.192 "data_offset": 2048, 00:22:21.192 "data_size": 63488 00:22:21.192 }, 00:22:21.192 { 00:22:21.192 "name": "BaseBdev4", 00:22:21.192 "uuid": 
"782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:21.192 "is_configured": true, 00:22:21.192 "data_offset": 2048, 00:22:21.192 "data_size": 63488 00:22:21.192 } 00:22:21.192 ] 00:22:21.192 }' 00:22:21.192 10:17:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.192 10:17:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:21.763 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:22.023 [2024-06-10 10:17:43.681666] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:22.023 [2024-06-10 10:17:43.681684] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:22.023 [2024-06-10 10:17:43.681724] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:22.023 [2024-06-10 10:17:43.681777] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:22.023 [2024-06-10 10:17:43.681784] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd69e20 name raid_bdev1, state offline 00:22:22.023 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:22:22.023 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.023 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:22.023 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:22.023 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:22.023 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:22.023 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:22.024 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:22.024 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:22.024 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:22.024 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:22.024 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:22.024 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:22.024 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:22.024 10:17:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:22.286 /dev/nbd0 00:22:22.286 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:22.286 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:22.286 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:22:22.286 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:22:22.286 10:17:44 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:22.286 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:22.286 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:22.287 1+0 records in 00:22:22.287 1+0 records out 00:22:22.287 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287749 s, 14.2 MB/s 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:22.287 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:22.547 /dev/nbd1 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:22.547 1+0 records in 00:22:22.547 1+0 records out 00:22:22.547 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269028 s, 15.2 MB/s 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:22.547 10:17:44 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:22.547 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:22.807 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:22.807 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:22.807 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:22.807 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:22.807 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:22.807 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:22.807 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:22.807 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:22.807 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:22.807 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:23.068 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:23.068 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:23.068 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:23.068 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:23.068 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:23.068 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:23.068 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:23.068 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:23.068 
10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:23.068 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:23.329 10:17:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:23.329 [2024-06-10 10:17:45.113801] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:23.329 [2024-06-10 10:17:45.113847] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:23.329 [2024-06-10 10:17:45.113866] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd6d980 00:22:23.329 [2024-06-10 10:17:45.113873] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:23.329 [2024-06-10 10:17:45.115155] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:23.329 [2024-06-10 10:17:45.115177] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:23.329 [2024-06-10 10:17:45.115229] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:23.329 [2024-06-10 10:17:45.115247] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:23.329 [2024-06-10 10:17:45.115324] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:23.329 [2024-06-10 10:17:45.115380] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:23.329 spare 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.329 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.590 [2024-06-10 10:17:45.215671] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdd4ef0 00:22:23.590 [2024-06-10 10:17:45.215680] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:23.590 [2024-06-10 10:17:45.215840] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd6dc10 00:22:23.590 
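For reference, the state check that verify_raid_bdev_state drives in the trace above reduces to the following shell sketch. The rpc.py path, socket and jq filter are copied verbatim from the trace; the expected values ('online', raid1, 3 operational base bdevs) are the ones this particular invocation asserts, and the exact set of fields compared here is illustrative rather than the helper's full logic.
# Query the raid bdevs and pick out raid_bdev1, as bdev_raid.sh@126 does in the trace.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
# Assert a few of the fields the helper inspects (values for this call: online raid1 0 3).
[[ $(jq -r '.state'      <<< "$info") == online ]]
[[ $(jq -r '.raid_level' <<< "$info") == raid1 ]]
[[ $(jq -r '.num_base_bdevs_operational' <<< "$info") == 3 ]]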
[2024-06-10 10:17:45.215953] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdd4ef0 00:22:23.590 [2024-06-10 10:17:45.215959] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdd4ef0 00:22:23.590 [2024-06-10 10:17:45.216032] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:23.590 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.590 "name": "raid_bdev1", 00:22:23.590 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:23.590 "strip_size_kb": 0, 00:22:23.590 "state": "online", 00:22:23.590 "raid_level": "raid1", 00:22:23.590 "superblock": true, 00:22:23.590 "num_base_bdevs": 4, 00:22:23.590 "num_base_bdevs_discovered": 3, 00:22:23.590 "num_base_bdevs_operational": 3, 00:22:23.590 "base_bdevs_list": [ 00:22:23.590 { 00:22:23.590 "name": "spare", 00:22:23.590 "uuid": "6e140a57-1eaa-5cb3-b0bb-bbe552c7a54c", 00:22:23.590 "is_configured": true, 00:22:23.590 "data_offset": 2048, 00:22:23.590 "data_size": 63488 00:22:23.590 }, 00:22:23.590 { 00:22:23.590 "name": null, 00:22:23.590 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.590 "is_configured": false, 00:22:23.590 "data_offset": 2048, 00:22:23.590 "data_size": 63488 00:22:23.590 }, 00:22:23.590 { 00:22:23.590 "name": "BaseBdev3", 00:22:23.590 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:23.590 "is_configured": true, 00:22:23.590 "data_offset": 2048, 00:22:23.590 "data_size": 63488 00:22:23.590 }, 00:22:23.590 { 00:22:23.590 "name": "BaseBdev4", 00:22:23.590 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:23.590 "is_configured": true, 00:22:23.590 "data_offset": 2048, 00:22:23.590 "data_size": 63488 00:22:23.590 } 00:22:23.590 ] 00:22:23.590 }' 00:22:23.590 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.590 10:17:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:24.160 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:24.160 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:24.160 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:24.160 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:24.160 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:24.160 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:24.160 10:17:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.420 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:24.420 "name": "raid_bdev1", 00:22:24.420 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:24.420 "strip_size_kb": 0, 00:22:24.420 "state": "online", 00:22:24.420 "raid_level": "raid1", 00:22:24.420 "superblock": true, 00:22:24.420 "num_base_bdevs": 4, 00:22:24.420 "num_base_bdevs_discovered": 3, 00:22:24.420 "num_base_bdevs_operational": 3, 00:22:24.420 "base_bdevs_list": [ 00:22:24.420 { 00:22:24.420 "name": "spare", 00:22:24.420 "uuid": "6e140a57-1eaa-5cb3-b0bb-bbe552c7a54c", 00:22:24.420 "is_configured": true, 00:22:24.420 "data_offset": 2048, 00:22:24.420 
"data_size": 63488 00:22:24.420 }, 00:22:24.420 { 00:22:24.420 "name": null, 00:22:24.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.420 "is_configured": false, 00:22:24.420 "data_offset": 2048, 00:22:24.420 "data_size": 63488 00:22:24.420 }, 00:22:24.420 { 00:22:24.420 "name": "BaseBdev3", 00:22:24.420 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:24.420 "is_configured": true, 00:22:24.420 "data_offset": 2048, 00:22:24.420 "data_size": 63488 00:22:24.420 }, 00:22:24.420 { 00:22:24.420 "name": "BaseBdev4", 00:22:24.420 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:24.420 "is_configured": true, 00:22:24.420 "data_offset": 2048, 00:22:24.420 "data_size": 63488 00:22:24.420 } 00:22:24.420 ] 00:22:24.420 }' 00:22:24.420 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:24.420 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:24.420 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:24.420 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:24.420 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:24.420 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:24.680 [2024-06-10 10:17:46.485339] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.680 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:24.941 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:24.941 "name": "raid_bdev1", 00:22:24.941 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 
00:22:24.941 "strip_size_kb": 0, 00:22:24.941 "state": "online", 00:22:24.941 "raid_level": "raid1", 00:22:24.941 "superblock": true, 00:22:24.941 "num_base_bdevs": 4, 00:22:24.941 "num_base_bdevs_discovered": 2, 00:22:24.941 "num_base_bdevs_operational": 2, 00:22:24.941 "base_bdevs_list": [ 00:22:24.941 { 00:22:24.941 "name": null, 00:22:24.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.941 "is_configured": false, 00:22:24.941 "data_offset": 2048, 00:22:24.941 "data_size": 63488 00:22:24.941 }, 00:22:24.941 { 00:22:24.941 "name": null, 00:22:24.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.941 "is_configured": false, 00:22:24.941 "data_offset": 2048, 00:22:24.941 "data_size": 63488 00:22:24.941 }, 00:22:24.941 { 00:22:24.941 "name": "BaseBdev3", 00:22:24.941 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:24.941 "is_configured": true, 00:22:24.941 "data_offset": 2048, 00:22:24.941 "data_size": 63488 00:22:24.941 }, 00:22:24.941 { 00:22:24.941 "name": "BaseBdev4", 00:22:24.941 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:24.941 "is_configured": true, 00:22:24.941 "data_offset": 2048, 00:22:24.941 "data_size": 63488 00:22:24.941 } 00:22:24.941 ] 00:22:24.941 }' 00:22:24.941 10:17:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:24.941 10:17:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:25.512 10:17:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:25.772 [2024-06-10 10:17:47.379623] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:25.772 [2024-06-10 10:17:47.379738] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:22:25.772 [2024-06-10 10:17:47.379748] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:22:25.772 [2024-06-10 10:17:47.379767] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:25.772 [2024-06-10 10:17:47.382495] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xde3810 00:22:25.772 [2024-06-10 10:17:47.384126] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:25.772 10:17:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:26.712 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:26.712 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:26.712 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:26.712 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:26.712 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:26.712 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.712 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:26.972 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:26.972 "name": "raid_bdev1", 00:22:26.972 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:26.972 "strip_size_kb": 0, 00:22:26.972 "state": "online", 00:22:26.972 "raid_level": "raid1", 00:22:26.972 "superblock": true, 00:22:26.972 "num_base_bdevs": 4, 00:22:26.972 "num_base_bdevs_discovered": 3, 00:22:26.972 "num_base_bdevs_operational": 3, 00:22:26.972 "process": { 00:22:26.972 "type": "rebuild", 00:22:26.972 "target": "spare", 00:22:26.972 "progress": { 00:22:26.972 "blocks": 22528, 00:22:26.972 "percent": 35 00:22:26.972 } 00:22:26.972 }, 00:22:26.972 "base_bdevs_list": [ 00:22:26.972 { 00:22:26.972 "name": "spare", 00:22:26.972 "uuid": "6e140a57-1eaa-5cb3-b0bb-bbe552c7a54c", 00:22:26.972 "is_configured": true, 00:22:26.972 "data_offset": 2048, 00:22:26.972 "data_size": 63488 00:22:26.972 }, 00:22:26.972 { 00:22:26.972 "name": null, 00:22:26.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:26.972 "is_configured": false, 00:22:26.972 "data_offset": 2048, 00:22:26.972 "data_size": 63488 00:22:26.972 }, 00:22:26.972 { 00:22:26.972 "name": "BaseBdev3", 00:22:26.972 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:26.972 "is_configured": true, 00:22:26.972 "data_offset": 2048, 00:22:26.972 "data_size": 63488 00:22:26.972 }, 00:22:26.972 { 00:22:26.972 "name": "BaseBdev4", 00:22:26.972 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:26.972 "is_configured": true, 00:22:26.972 "data_offset": 2048, 00:22:26.972 "data_size": 63488 00:22:26.972 } 00:22:26.972 ] 00:22:26.972 }' 00:22:26.972 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:26.972 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:26.972 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:26.972 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:26.972 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:27.232 [2024-06-10 10:17:48.864886] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:27.232 [2024-06-10 10:17:48.892921] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:27.232 [2024-06-10 10:17:48.892952] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:27.232 [2024-06-10 10:17:48.892963] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:27.232 [2024-06-10 10:17:48.892967] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.232 10:17:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.493 10:17:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:27.493 "name": "raid_bdev1", 00:22:27.493 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:27.493 "strip_size_kb": 0, 00:22:27.493 "state": "online", 00:22:27.493 "raid_level": "raid1", 00:22:27.493 "superblock": true, 00:22:27.493 "num_base_bdevs": 4, 00:22:27.493 "num_base_bdevs_discovered": 2, 00:22:27.493 "num_base_bdevs_operational": 2, 00:22:27.493 "base_bdevs_list": [ 00:22:27.493 { 00:22:27.493 "name": null, 00:22:27.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.493 "is_configured": false, 00:22:27.493 "data_offset": 2048, 00:22:27.493 "data_size": 63488 00:22:27.493 }, 00:22:27.493 { 00:22:27.493 "name": null, 00:22:27.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.493 "is_configured": false, 00:22:27.493 "data_offset": 2048, 00:22:27.493 "data_size": 63488 00:22:27.493 }, 00:22:27.493 { 00:22:27.493 "name": "BaseBdev3", 00:22:27.493 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:27.493 "is_configured": true, 00:22:27.493 "data_offset": 2048, 00:22:27.493 "data_size": 63488 00:22:27.493 }, 00:22:27.493 { 00:22:27.493 "name": "BaseBdev4", 00:22:27.493 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:27.493 "is_configured": true, 00:22:27.493 "data_offset": 2048, 00:22:27.493 "data_size": 63488 
00:22:27.493 } 00:22:27.493 ] 00:22:27.493 }' 00:22:27.493 10:17:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:27.493 10:17:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:28.064 10:17:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:28.064 [2024-06-10 10:17:49.835324] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:28.064 [2024-06-10 10:17:49.835357] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:28.064 [2024-06-10 10:17:49.835374] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdd5360 00:22:28.064 [2024-06-10 10:17:49.835381] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:28.064 [2024-06-10 10:17:49.835677] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:28.064 [2024-06-10 10:17:49.835689] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:28.064 [2024-06-10 10:17:49.835747] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:28.064 [2024-06-10 10:17:49.835754] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:22:28.064 [2024-06-10 10:17:49.835759] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:28.064 [2024-06-10 10:17:49.835770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:28.064 [2024-06-10 10:17:49.838472] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd6dc10 00:22:28.064 [2024-06-10 10:17:49.839567] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:28.064 spare 00:22:28.064 10:17:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:29.006 10:17:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:29.006 10:17:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:29.006 10:17:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:29.006 10:17:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:29.006 10:17:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:29.006 10:17:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.006 10:17:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.266 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:29.266 "name": "raid_bdev1", 00:22:29.266 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:29.266 "strip_size_kb": 0, 00:22:29.266 "state": "online", 00:22:29.266 "raid_level": "raid1", 00:22:29.266 "superblock": true, 00:22:29.266 "num_base_bdevs": 4, 00:22:29.266 "num_base_bdevs_discovered": 3, 00:22:29.266 "num_base_bdevs_operational": 3, 00:22:29.266 "process": { 00:22:29.266 "type": "rebuild", 00:22:29.266 "target": "spare", 
00:22:29.266 "progress": { 00:22:29.266 "blocks": 22528, 00:22:29.266 "percent": 35 00:22:29.266 } 00:22:29.266 }, 00:22:29.266 "base_bdevs_list": [ 00:22:29.266 { 00:22:29.266 "name": "spare", 00:22:29.266 "uuid": "6e140a57-1eaa-5cb3-b0bb-bbe552c7a54c", 00:22:29.266 "is_configured": true, 00:22:29.266 "data_offset": 2048, 00:22:29.266 "data_size": 63488 00:22:29.266 }, 00:22:29.266 { 00:22:29.266 "name": null, 00:22:29.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:29.266 "is_configured": false, 00:22:29.266 "data_offset": 2048, 00:22:29.266 "data_size": 63488 00:22:29.266 }, 00:22:29.266 { 00:22:29.266 "name": "BaseBdev3", 00:22:29.266 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:29.266 "is_configured": true, 00:22:29.266 "data_offset": 2048, 00:22:29.266 "data_size": 63488 00:22:29.266 }, 00:22:29.266 { 00:22:29.266 "name": "BaseBdev4", 00:22:29.266 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:29.266 "is_configured": true, 00:22:29.266 "data_offset": 2048, 00:22:29.266 "data_size": 63488 00:22:29.266 } 00:22:29.266 ] 00:22:29.266 }' 00:22:29.266 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:29.266 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:29.266 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:29.266 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:29.266 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:29.526 [2024-06-10 10:17:51.291951] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:29.526 [2024-06-10 10:17:51.348365] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:29.526 [2024-06-10 10:17:51.348393] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:29.526 [2024-06-10 10:17:51.348403] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:29.526 [2024-06-10 10:17:51.348407] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.526 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.787 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:29.787 "name": "raid_bdev1", 00:22:29.787 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:29.787 "strip_size_kb": 0, 00:22:29.787 "state": "online", 00:22:29.787 "raid_level": "raid1", 00:22:29.787 "superblock": true, 00:22:29.787 "num_base_bdevs": 4, 00:22:29.787 "num_base_bdevs_discovered": 2, 00:22:29.787 "num_base_bdevs_operational": 2, 00:22:29.787 "base_bdevs_list": [ 00:22:29.787 { 00:22:29.787 "name": null, 00:22:29.787 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:29.787 "is_configured": false, 00:22:29.787 "data_offset": 2048, 00:22:29.787 "data_size": 63488 00:22:29.787 }, 00:22:29.787 { 00:22:29.787 "name": null, 00:22:29.787 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:29.787 "is_configured": false, 00:22:29.787 "data_offset": 2048, 00:22:29.787 "data_size": 63488 00:22:29.787 }, 00:22:29.787 { 00:22:29.787 "name": "BaseBdev3", 00:22:29.787 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:29.787 "is_configured": true, 00:22:29.787 "data_offset": 2048, 00:22:29.787 "data_size": 63488 00:22:29.787 }, 00:22:29.787 { 00:22:29.787 "name": "BaseBdev4", 00:22:29.787 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:29.787 "is_configured": true, 00:22:29.787 "data_offset": 2048, 00:22:29.787 "data_size": 63488 00:22:29.787 } 00:22:29.787 ] 00:22:29.787 }' 00:22:29.787 10:17:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:29.787 10:17:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:30.369 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:30.370 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:30.370 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:30.370 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:30.370 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:30.370 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:30.370 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.633 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:30.633 "name": "raid_bdev1", 00:22:30.633 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:30.633 "strip_size_kb": 0, 00:22:30.633 "state": "online", 00:22:30.633 "raid_level": "raid1", 00:22:30.633 "superblock": true, 00:22:30.633 "num_base_bdevs": 4, 00:22:30.633 "num_base_bdevs_discovered": 2, 00:22:30.633 "num_base_bdevs_operational": 2, 00:22:30.633 "base_bdevs_list": [ 00:22:30.633 { 00:22:30.633 "name": null, 00:22:30.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.633 "is_configured": false, 00:22:30.633 "data_offset": 2048, 00:22:30.633 "data_size": 63488 00:22:30.633 }, 00:22:30.633 { 00:22:30.633 "name": null, 00:22:30.633 "uuid": "00000000-0000-0000-0000-000000000000", 
00:22:30.633 "is_configured": false, 00:22:30.633 "data_offset": 2048, 00:22:30.633 "data_size": 63488 00:22:30.633 }, 00:22:30.633 { 00:22:30.633 "name": "BaseBdev3", 00:22:30.633 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:30.633 "is_configured": true, 00:22:30.633 "data_offset": 2048, 00:22:30.633 "data_size": 63488 00:22:30.633 }, 00:22:30.633 { 00:22:30.633 "name": "BaseBdev4", 00:22:30.633 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:30.633 "is_configured": true, 00:22:30.633 "data_offset": 2048, 00:22:30.633 "data_size": 63488 00:22:30.633 } 00:22:30.633 ] 00:22:30.633 }' 00:22:30.633 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:30.633 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:30.633 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:30.633 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:30.633 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:30.894 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:30.894 [2024-06-10 10:17:52.751620] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:30.894 [2024-06-10 10:17:52.751653] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:30.894 [2024-06-10 10:17:52.751667] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde2c30 00:22:30.894 [2024-06-10 10:17:52.751674] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:30.894 [2024-06-10 10:17:52.751949] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:30.894 [2024-06-10 10:17:52.751962] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:30.894 [2024-06-10 10:17:52.752009] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:30.894 [2024-06-10 10:17:52.752016] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:30.894 [2024-06-10 10:17:52.752022] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:30.894 BaseBdev1 00:22:31.154 10:17:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.092 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.352 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.352 "name": "raid_bdev1", 00:22:32.352 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:32.352 "strip_size_kb": 0, 00:22:32.352 "state": "online", 00:22:32.352 "raid_level": "raid1", 00:22:32.352 "superblock": true, 00:22:32.353 "num_base_bdevs": 4, 00:22:32.353 "num_base_bdevs_discovered": 2, 00:22:32.353 "num_base_bdevs_operational": 2, 00:22:32.353 "base_bdevs_list": [ 00:22:32.353 { 00:22:32.353 "name": null, 00:22:32.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.353 "is_configured": false, 00:22:32.353 "data_offset": 2048, 00:22:32.353 "data_size": 63488 00:22:32.353 }, 00:22:32.353 { 00:22:32.353 "name": null, 00:22:32.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.353 "is_configured": false, 00:22:32.353 "data_offset": 2048, 00:22:32.353 "data_size": 63488 00:22:32.353 }, 00:22:32.353 { 00:22:32.353 "name": "BaseBdev3", 00:22:32.353 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:32.353 "is_configured": true, 00:22:32.353 "data_offset": 2048, 00:22:32.353 "data_size": 63488 00:22:32.353 }, 00:22:32.353 { 00:22:32.353 "name": "BaseBdev4", 00:22:32.353 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:32.353 "is_configured": true, 00:22:32.353 "data_offset": 2048, 00:22:32.353 "data_size": 63488 00:22:32.353 } 00:22:32.353 ] 00:22:32.353 }' 00:22:32.353 10:17:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.353 10:17:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:32.613 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:32.613 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:32.613 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:32.613 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:32.613 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:32.613 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.613 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.873 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:32.873 "name": "raid_bdev1", 00:22:32.873 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:32.873 "strip_size_kb": 0, 00:22:32.873 "state": "online", 00:22:32.873 "raid_level": "raid1", 00:22:32.873 "superblock": true, 00:22:32.873 "num_base_bdevs": 4, 00:22:32.873 
"num_base_bdevs_discovered": 2, 00:22:32.873 "num_base_bdevs_operational": 2, 00:22:32.873 "base_bdevs_list": [ 00:22:32.873 { 00:22:32.873 "name": null, 00:22:32.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.873 "is_configured": false, 00:22:32.873 "data_offset": 2048, 00:22:32.873 "data_size": 63488 00:22:32.873 }, 00:22:32.873 { 00:22:32.873 "name": null, 00:22:32.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.873 "is_configured": false, 00:22:32.873 "data_offset": 2048, 00:22:32.873 "data_size": 63488 00:22:32.873 }, 00:22:32.873 { 00:22:32.873 "name": "BaseBdev3", 00:22:32.873 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:32.873 "is_configured": true, 00:22:32.873 "data_offset": 2048, 00:22:32.873 "data_size": 63488 00:22:32.873 }, 00:22:32.873 { 00:22:32.873 "name": "BaseBdev4", 00:22:32.873 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:32.873 "is_configured": true, 00:22:32.873 "data_offset": 2048, 00:22:32.873 "data_size": 63488 00:22:32.873 } 00:22:32.873 ] 00:22:32.873 }' 00:22:32.873 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:32.873 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:32.873 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:32.873 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:32.874 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:32.874 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@649 -- # local es=0 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:33.134 [2024-06-10 10:17:54.905085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev1 is claimed 00:22:33.134 [2024-06-10 10:17:54.905172] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:33.134 [2024-06-10 10:17:54.905181] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:33.134 request: 00:22:33.134 { 00:22:33.134 "raid_bdev": "raid_bdev1", 00:22:33.134 "base_bdev": "BaseBdev1", 00:22:33.134 "method": "bdev_raid_add_base_bdev", 00:22:33.134 "req_id": 1 00:22:33.134 } 00:22:33.134 Got JSON-RPC error response 00:22:33.134 response: 00:22:33.134 { 00:22:33.134 "code": -22, 00:22:33.134 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:33.134 } 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # es=1 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:22:33.134 10:17:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:34.074 10:17:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.335 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:34.335 "name": "raid_bdev1", 00:22:34.335 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:34.335 "strip_size_kb": 0, 00:22:34.335 "state": "online", 00:22:34.335 "raid_level": "raid1", 00:22:34.335 "superblock": true, 00:22:34.335 "num_base_bdevs": 4, 00:22:34.335 "num_base_bdevs_discovered": 2, 00:22:34.335 "num_base_bdevs_operational": 2, 00:22:34.335 "base_bdevs_list": [ 00:22:34.335 { 00:22:34.335 "name": null, 00:22:34.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.335 "is_configured": false, 00:22:34.335 "data_offset": 2048, 00:22:34.335 "data_size": 63488 00:22:34.335 }, 00:22:34.335 { 00:22:34.335 "name": null, 00:22:34.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.335 "is_configured": false, 00:22:34.335 
"data_offset": 2048, 00:22:34.335 "data_size": 63488 00:22:34.335 }, 00:22:34.335 { 00:22:34.335 "name": "BaseBdev3", 00:22:34.335 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:34.335 "is_configured": true, 00:22:34.335 "data_offset": 2048, 00:22:34.335 "data_size": 63488 00:22:34.335 }, 00:22:34.335 { 00:22:34.335 "name": "BaseBdev4", 00:22:34.335 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:34.335 "is_configured": true, 00:22:34.335 "data_offset": 2048, 00:22:34.335 "data_size": 63488 00:22:34.335 } 00:22:34.335 ] 00:22:34.335 }' 00:22:34.335 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:34.335 10:17:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:34.905 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:34.905 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:34.905 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:34.905 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:34.905 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:34.905 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.905 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:35.165 "name": "raid_bdev1", 00:22:35.165 "uuid": "07a281bb-c41f-4fc5-92b1-c2fd7c4db784", 00:22:35.165 "strip_size_kb": 0, 00:22:35.165 "state": "online", 00:22:35.165 "raid_level": "raid1", 00:22:35.165 "superblock": true, 00:22:35.165 "num_base_bdevs": 4, 00:22:35.165 "num_base_bdevs_discovered": 2, 00:22:35.165 "num_base_bdevs_operational": 2, 00:22:35.165 "base_bdevs_list": [ 00:22:35.165 { 00:22:35.165 "name": null, 00:22:35.165 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.165 "is_configured": false, 00:22:35.165 "data_offset": 2048, 00:22:35.165 "data_size": 63488 00:22:35.165 }, 00:22:35.165 { 00:22:35.165 "name": null, 00:22:35.165 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.165 "is_configured": false, 00:22:35.165 "data_offset": 2048, 00:22:35.165 "data_size": 63488 00:22:35.165 }, 00:22:35.165 { 00:22:35.165 "name": "BaseBdev3", 00:22:35.165 "uuid": "e0b19090-66a9-5022-b2ab-7770bfa80a3b", 00:22:35.165 "is_configured": true, 00:22:35.165 "data_offset": 2048, 00:22:35.165 "data_size": 63488 00:22:35.165 }, 00:22:35.165 { 00:22:35.165 "name": "BaseBdev4", 00:22:35.165 "uuid": "782c5b72-24ce-5cae-b81c-8de52595c9cc", 00:22:35.165 "is_configured": true, 00:22:35.165 "data_offset": 2048, 00:22:35.165 "data_size": 63488 00:22:35.165 } 00:22:35.165 ] 00:22:35.165 }' 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@782 -- # killprocess 1090356 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1090356 ']' 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # kill -0 1090356 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # uname 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1090356 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1090356' 00:22:35.165 killing process with pid 1090356 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # kill 1090356 00:22:35.165 Received shutdown signal, test time was about 60.000000 seconds 00:22:35.165 00:22:35.165 Latency(us) 00:22:35.165 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:35.165 =================================================================================================================== 00:22:35.165 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:35.165 [2024-06-10 10:17:56.924489] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:35.165 [2024-06-10 10:17:56.924555] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:35.165 [2024-06-10 10:17:56.924595] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:35.165 [2024-06-10 10:17:56.924602] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdd4ef0 name raid_bdev1, state offline 00:22:35.165 10:17:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@973 -- # wait 1090356 00:22:35.165 [2024-06-10 10:17:56.950842] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:22:35.425 00:22:35.425 real 0m34.764s 00:22:35.425 user 0m49.551s 00:22:35.425 sys 0m5.171s 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:35.425 ************************************ 00:22:35.425 END TEST raid_rebuild_test_sb 00:22:35.425 ************************************ 00:22:35.425 10:17:57 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:22:35.425 10:17:57 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:22:35.425 10:17:57 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:35.425 10:17:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:35.425 ************************************ 00:22:35.425 START TEST raid_rebuild_test_io 00:22:35.425 ************************************ 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 false true true 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:35.425 
10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:35.425 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1096621 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1096621 /var/tmp/spdk-raid.sock 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@830 -- # '[' -z 1096621 ']' 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:35.426 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:35.426 10:17:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:35.426 [2024-06-10 10:17:57.209469] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:22:35.426 [2024-06-10 10:17:57.209513] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1096621 ] 00:22:35.426 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:35.426 Zero copy mechanism will not be used. 00:22:35.685 [2024-06-10 10:17:57.293997] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:35.685 [2024-06-10 10:17:57.355982] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:35.685 [2024-06-10 10:17:57.397979] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:35.685 [2024-06-10 10:17:57.398003] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:36.255 10:17:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:36.255 10:17:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@863 -- # return 0 00:22:36.255 10:17:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:36.255 10:17:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:36.547 BaseBdev1_malloc 00:22:36.547 10:17:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:36.547 [2024-06-10 10:17:58.376558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:36.547 [2024-06-10 10:17:58.376593] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:36.547 [2024-06-10 10:17:58.376606] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15aea50 00:22:36.547 [2024-06-10 10:17:58.376613] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:36.547 [2024-06-10 10:17:58.377921] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:36.547 [2024-06-10 10:17:58.377943] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:36.547 BaseBdev1 00:22:36.547 10:17:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:36.547 10:17:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:36.830 
BaseBdev2_malloc 00:22:36.830 10:17:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:37.090 [2024-06-10 10:17:58.755542] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:37.090 [2024-06-10 10:17:58.755571] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:37.090 [2024-06-10 10:17:58.755583] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15af5a0 00:22:37.090 [2024-06-10 10:17:58.755590] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:37.090 [2024-06-10 10:17:58.756775] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:37.090 [2024-06-10 10:17:58.756793] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:37.090 BaseBdev2 00:22:37.090 10:17:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:37.090 10:17:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:37.090 BaseBdev3_malloc 00:22:37.349 10:17:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:37.349 [2024-06-10 10:17:59.138462] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:37.349 [2024-06-10 10:17:59.138491] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:37.349 [2024-06-10 10:17:59.138502] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x175ba30 00:22:37.349 [2024-06-10 10:17:59.138508] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:37.349 [2024-06-10 10:17:59.139681] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:37.349 [2024-06-10 10:17:59.139700] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:37.349 BaseBdev3 00:22:37.349 10:17:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:37.349 10:17:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:37.609 BaseBdev4_malloc 00:22:37.609 10:17:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:37.868 [2024-06-10 10:17:59.521401] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:37.868 [2024-06-10 10:17:59.521428] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:37.868 [2024-06-10 10:17:59.521438] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x175a2c0 00:22:37.868 [2024-06-10 10:17:59.521444] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:37.868 [2024-06-10 10:17:59.522611] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:37.868 
[2024-06-10 10:17:59.522629] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:37.868 BaseBdev4 00:22:37.869 10:17:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:37.869 spare_malloc 00:22:37.869 10:17:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:38.129 spare_delay 00:22:38.129 10:17:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:38.389 [2024-06-10 10:18:00.092746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:38.389 [2024-06-10 10:18:00.092782] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:38.389 [2024-06-10 10:18:00.092795] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17602d0 00:22:38.389 [2024-06-10 10:18:00.092802] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:38.389 [2024-06-10 10:18:00.094067] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:38.389 [2024-06-10 10:18:00.094087] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:38.389 spare 00:22:38.390 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:38.651 [2024-06-10 10:18:00.285248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:38.651 [2024-06-10 10:18:00.286269] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:38.651 [2024-06-10 10:18:00.286311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:38.651 [2024-06-10 10:18:00.286345] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:38.651 [2024-06-10 10:18:00.286404] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16dfe20 00:22:38.651 [2024-06-10 10:18:00.286410] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:38.651 [2024-06-10 10:18:00.286577] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16e31d0 00:22:38.651 [2024-06-10 10:18:00.286689] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16dfe20 00:22:38.651 [2024-06-10 10:18:00.286695] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16dfe20 00:22:38.651 [2024-06-10 10:18:00.286777] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:38.651 "name": "raid_bdev1", 00:22:38.651 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:38.651 "strip_size_kb": 0, 00:22:38.651 "state": "online", 00:22:38.651 "raid_level": "raid1", 00:22:38.651 "superblock": false, 00:22:38.651 "num_base_bdevs": 4, 00:22:38.651 "num_base_bdevs_discovered": 4, 00:22:38.651 "num_base_bdevs_operational": 4, 00:22:38.651 "base_bdevs_list": [ 00:22:38.651 { 00:22:38.651 "name": "BaseBdev1", 00:22:38.651 "uuid": "968b91e3-f2e2-5bf8-87ba-791ccd1734b1", 00:22:38.651 "is_configured": true, 00:22:38.651 "data_offset": 0, 00:22:38.651 "data_size": 65536 00:22:38.651 }, 00:22:38.651 { 00:22:38.651 "name": "BaseBdev2", 00:22:38.651 "uuid": "90a4a80d-6694-5092-901d-9a910eac9785", 00:22:38.651 "is_configured": true, 00:22:38.651 "data_offset": 0, 00:22:38.651 "data_size": 65536 00:22:38.651 }, 00:22:38.651 { 00:22:38.651 "name": "BaseBdev3", 00:22:38.651 "uuid": "29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:38.651 "is_configured": true, 00:22:38.651 "data_offset": 0, 00:22:38.651 "data_size": 65536 00:22:38.651 }, 00:22:38.651 { 00:22:38.651 "name": "BaseBdev4", 00:22:38.651 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:38.651 "is_configured": true, 00:22:38.651 "data_offset": 0, 00:22:38.651 "data_size": 65536 00:22:38.651 } 00:22:38.651 ] 00:22:38.651 }' 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:38.651 10:18:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:39.223 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:39.223 10:18:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:39.483 [2024-06-10 10:18:01.163666] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:39.483 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:22:39.483 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.483 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@618 -- # data_offset=0 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:39.742 [2024-06-10 10:18:01.461606] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16e5830 00:22:39.742 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:39.742 Zero copy mechanism will not be used. 00:22:39.742 Running I/O for 60 seconds... 00:22:39.742 [2024-06-10 10:18:01.554796] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:39.742 [2024-06-10 10:18:01.554952] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x16e5830 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.742 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.002 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:40.002 "name": "raid_bdev1", 00:22:40.002 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:40.002 "strip_size_kb": 0, 00:22:40.002 "state": "online", 00:22:40.002 "raid_level": "raid1", 00:22:40.002 "superblock": false, 00:22:40.002 "num_base_bdevs": 4, 00:22:40.002 "num_base_bdevs_discovered": 3, 00:22:40.002 "num_base_bdevs_operational": 3, 00:22:40.002 "base_bdevs_list": [ 00:22:40.002 { 00:22:40.002 "name": null, 00:22:40.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:40.002 "is_configured": false, 00:22:40.002 "data_offset": 0, 00:22:40.002 "data_size": 65536 00:22:40.002 }, 00:22:40.002 { 00:22:40.002 "name": "BaseBdev2", 00:22:40.002 "uuid": "90a4a80d-6694-5092-901d-9a910eac9785", 00:22:40.002 "is_configured": true, 00:22:40.002 "data_offset": 0, 00:22:40.002 "data_size": 65536 00:22:40.002 }, 00:22:40.002 { 00:22:40.002 "name": "BaseBdev3", 00:22:40.002 "uuid": 
"29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:40.002 "is_configured": true, 00:22:40.002 "data_offset": 0, 00:22:40.002 "data_size": 65536 00:22:40.002 }, 00:22:40.002 { 00:22:40.002 "name": "BaseBdev4", 00:22:40.002 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:40.002 "is_configured": true, 00:22:40.002 "data_offset": 0, 00:22:40.002 "data_size": 65536 00:22:40.002 } 00:22:40.002 ] 00:22:40.002 }' 00:22:40.002 10:18:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.002 10:18:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:40.573 10:18:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:40.834 [2024-06-10 10:18:02.495831] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:40.834 10:18:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:40.834 [2024-06-10 10:18:02.537929] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1645cd0 00:22:40.834 [2024-06-10 10:18:02.539733] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:40.834 [2024-06-10 10:18:02.655277] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:40.834 [2024-06-10 10:18:02.656029] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:41.094 [2024-06-10 10:18:02.873053] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:41.094 [2024-06-10 10:18:02.873400] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:41.665 [2024-06-10 10:18:03.223275] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:41.665 [2024-06-10 10:18:03.425991] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:41.665 [2024-06-10 10:18:03.426128] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:41.926 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:41.926 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:41.926 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:41.926 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:41.926 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:41.926 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.926 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.926 [2024-06-10 10:18:03.674424] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:41.926 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:22:41.926 "name": "raid_bdev1", 00:22:41.926 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:41.926 "strip_size_kb": 0, 00:22:41.926 "state": "online", 00:22:41.926 "raid_level": "raid1", 00:22:41.926 "superblock": false, 00:22:41.926 "num_base_bdevs": 4, 00:22:41.926 "num_base_bdevs_discovered": 4, 00:22:41.926 "num_base_bdevs_operational": 4, 00:22:41.926 "process": { 00:22:41.926 "type": "rebuild", 00:22:41.926 "target": "spare", 00:22:41.926 "progress": { 00:22:41.926 "blocks": 14336, 00:22:41.926 "percent": 21 00:22:41.926 } 00:22:41.926 }, 00:22:41.926 "base_bdevs_list": [ 00:22:41.926 { 00:22:41.926 "name": "spare", 00:22:41.926 "uuid": "84a058dc-30d7-566c-989c-07f52793775f", 00:22:41.927 "is_configured": true, 00:22:41.927 "data_offset": 0, 00:22:41.927 "data_size": 65536 00:22:41.927 }, 00:22:41.927 { 00:22:41.927 "name": "BaseBdev2", 00:22:41.927 "uuid": "90a4a80d-6694-5092-901d-9a910eac9785", 00:22:41.927 "is_configured": true, 00:22:41.927 "data_offset": 0, 00:22:41.927 "data_size": 65536 00:22:41.927 }, 00:22:41.927 { 00:22:41.927 "name": "BaseBdev3", 00:22:41.927 "uuid": "29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:41.927 "is_configured": true, 00:22:41.927 "data_offset": 0, 00:22:41.927 "data_size": 65536 00:22:41.927 }, 00:22:41.927 { 00:22:41.927 "name": "BaseBdev4", 00:22:41.927 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:41.927 "is_configured": true, 00:22:41.927 "data_offset": 0, 00:22:41.927 "data_size": 65536 00:22:41.927 } 00:22:41.927 ] 00:22:41.927 }' 00:22:41.927 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:41.927 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:41.927 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:41.927 [2024-06-10 10:18:03.790926] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:42.187 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:42.187 10:18:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:42.187 [2024-06-10 10:18:03.981669] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:42.448 [2024-06-10 10:18:04.124128] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:42.448 [2024-06-10 10:18:04.140257] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:42.448 [2024-06-10 10:18:04.140276] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:42.448 [2024-06-10 10:18:04.140282] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:42.448 [2024-06-10 10:18:04.157288] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x16e5830 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.448 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.709 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.709 "name": "raid_bdev1", 00:22:42.709 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:42.709 "strip_size_kb": 0, 00:22:42.709 "state": "online", 00:22:42.709 "raid_level": "raid1", 00:22:42.709 "superblock": false, 00:22:42.709 "num_base_bdevs": 4, 00:22:42.709 "num_base_bdevs_discovered": 3, 00:22:42.709 "num_base_bdevs_operational": 3, 00:22:42.709 "base_bdevs_list": [ 00:22:42.709 { 00:22:42.709 "name": null, 00:22:42.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.709 "is_configured": false, 00:22:42.709 "data_offset": 0, 00:22:42.709 "data_size": 65536 00:22:42.709 }, 00:22:42.709 { 00:22:42.709 "name": "BaseBdev2", 00:22:42.709 "uuid": "90a4a80d-6694-5092-901d-9a910eac9785", 00:22:42.709 "is_configured": true, 00:22:42.709 "data_offset": 0, 00:22:42.709 "data_size": 65536 00:22:42.709 }, 00:22:42.709 { 00:22:42.709 "name": "BaseBdev3", 00:22:42.709 "uuid": "29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:42.709 "is_configured": true, 00:22:42.709 "data_offset": 0, 00:22:42.709 "data_size": 65536 00:22:42.709 }, 00:22:42.709 { 00:22:42.709 "name": "BaseBdev4", 00:22:42.709 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:42.709 "is_configured": true, 00:22:42.709 "data_offset": 0, 00:22:42.709 "data_size": 65536 00:22:42.709 } 00:22:42.709 ] 00:22:42.709 }' 00:22:42.709 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.709 10:18:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:43.281 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:43.281 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:43.281 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:43.281 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:43.281 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:43.281 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:43.281 10:18:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.281 10:18:05 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:43.281 "name": "raid_bdev1", 00:22:43.281 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:43.281 "strip_size_kb": 0, 00:22:43.281 "state": "online", 00:22:43.281 "raid_level": "raid1", 00:22:43.281 "superblock": false, 00:22:43.281 "num_base_bdevs": 4, 00:22:43.281 "num_base_bdevs_discovered": 3, 00:22:43.281 "num_base_bdevs_operational": 3, 00:22:43.281 "base_bdevs_list": [ 00:22:43.281 { 00:22:43.281 "name": null, 00:22:43.281 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:43.281 "is_configured": false, 00:22:43.281 "data_offset": 0, 00:22:43.281 "data_size": 65536 00:22:43.281 }, 00:22:43.281 { 00:22:43.281 "name": "BaseBdev2", 00:22:43.281 "uuid": "90a4a80d-6694-5092-901d-9a910eac9785", 00:22:43.281 "is_configured": true, 00:22:43.281 "data_offset": 0, 00:22:43.281 "data_size": 65536 00:22:43.281 }, 00:22:43.281 { 00:22:43.281 "name": "BaseBdev3", 00:22:43.281 "uuid": "29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:43.281 "is_configured": true, 00:22:43.281 "data_offset": 0, 00:22:43.281 "data_size": 65536 00:22:43.281 }, 00:22:43.281 { 00:22:43.281 "name": "BaseBdev4", 00:22:43.281 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:43.281 "is_configured": true, 00:22:43.281 "data_offset": 0, 00:22:43.281 "data_size": 65536 00:22:43.281 } 00:22:43.281 ] 00:22:43.281 }' 00:22:43.281 10:18:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:43.542 10:18:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:43.542 10:18:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:43.543 10:18:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:43.543 10:18:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:43.543 [2024-06-10 10:18:05.407878] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:43.803 [2024-06-10 10:18:05.442806] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1757710 00:22:43.803 [2024-06-10 10:18:05.443985] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:43.803 10:18:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:43.803 [2024-06-10 10:18:05.574434] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:43.803 [2024-06-10 10:18:05.574687] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:44.064 [2024-06-10 10:18:05.785171] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:44.064 [2024-06-10 10:18:05.785284] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:44.635 [2024-06-10 10:18:06.265517] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:44.635 [2024-06-10 10:18:06.265675] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:44.635 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:44.635 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:44.635 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:44.635 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:44.635 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:44.635 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.635 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.896 [2024-06-10 10:18:06.640087] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:44.896 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:44.896 "name": "raid_bdev1", 00:22:44.896 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:44.896 "strip_size_kb": 0, 00:22:44.896 "state": "online", 00:22:44.896 "raid_level": "raid1", 00:22:44.896 "superblock": false, 00:22:44.896 "num_base_bdevs": 4, 00:22:44.896 "num_base_bdevs_discovered": 4, 00:22:44.896 "num_base_bdevs_operational": 4, 00:22:44.896 "process": { 00:22:44.896 "type": "rebuild", 00:22:44.896 "target": "spare", 00:22:44.896 "progress": { 00:22:44.896 "blocks": 14336, 00:22:44.896 "percent": 21 00:22:44.896 } 00:22:44.896 }, 00:22:44.896 "base_bdevs_list": [ 00:22:44.896 { 00:22:44.896 "name": "spare", 00:22:44.896 "uuid": "84a058dc-30d7-566c-989c-07f52793775f", 00:22:44.896 "is_configured": true, 00:22:44.896 "data_offset": 0, 00:22:44.896 "data_size": 65536 00:22:44.896 }, 00:22:44.896 { 00:22:44.896 "name": "BaseBdev2", 00:22:44.896 "uuid": "90a4a80d-6694-5092-901d-9a910eac9785", 00:22:44.896 "is_configured": true, 00:22:44.896 "data_offset": 0, 00:22:44.896 "data_size": 65536 00:22:44.896 }, 00:22:44.896 { 00:22:44.896 "name": "BaseBdev3", 00:22:44.896 "uuid": "29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:44.896 "is_configured": true, 00:22:44.896 "data_offset": 0, 00:22:44.896 "data_size": 65536 00:22:44.896 }, 00:22:44.896 { 00:22:44.896 "name": "BaseBdev4", 00:22:44.896 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:44.896 "is_configured": true, 00:22:44.896 "data_offset": 0, 00:22:44.896 "data_size": 65536 00:22:44.896 } 00:22:44.896 ] 00:22:44.896 }' 00:22:44.896 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:44.896 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:44.896 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:44.896 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:44.896 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:22:44.896 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:22:44.896 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:44.896 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:22:44.896 10:18:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:45.157 [2024-06-10 10:18:06.903402] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:45.157 [2024-06-10 10:18:06.976600] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:45.418 [2024-06-10 10:18:07.083286] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x16e5830 00:22:45.418 [2024-06-10 10:18:07.083304] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1757710 00:22:45.418 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:22:45.418 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:22:45.418 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:45.418 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:45.418 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:45.418 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:45.418 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:45.418 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.418 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:45.418 [2024-06-10 10:18:07.235291] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:22:45.679 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:45.679 "name": "raid_bdev1", 00:22:45.679 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:45.679 "strip_size_kb": 0, 00:22:45.679 "state": "online", 00:22:45.679 "raid_level": "raid1", 00:22:45.679 "superblock": false, 00:22:45.680 "num_base_bdevs": 4, 00:22:45.680 "num_base_bdevs_discovered": 3, 00:22:45.680 "num_base_bdevs_operational": 3, 00:22:45.680 "process": { 00:22:45.680 "type": "rebuild", 00:22:45.680 "target": "spare", 00:22:45.680 "progress": { 00:22:45.680 "blocks": 22528, 00:22:45.680 "percent": 34 00:22:45.680 } 00:22:45.680 }, 00:22:45.680 "base_bdevs_list": [ 00:22:45.680 { 00:22:45.680 "name": "spare", 00:22:45.680 "uuid": "84a058dc-30d7-566c-989c-07f52793775f", 00:22:45.680 "is_configured": true, 00:22:45.680 "data_offset": 0, 00:22:45.680 "data_size": 65536 00:22:45.680 }, 00:22:45.680 { 00:22:45.680 "name": null, 00:22:45.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.680 "is_configured": false, 00:22:45.680 "data_offset": 0, 00:22:45.680 "data_size": 65536 00:22:45.680 }, 00:22:45.680 { 00:22:45.680 "name": "BaseBdev3", 00:22:45.680 "uuid": "29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:45.680 "is_configured": true, 00:22:45.680 "data_offset": 0, 00:22:45.680 "data_size": 65536 00:22:45.680 }, 00:22:45.680 { 00:22:45.680 "name": "BaseBdev4", 00:22:45.680 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:45.680 "is_configured": true, 00:22:45.680 "data_offset": 0, 00:22:45.680 "data_size": 65536 00:22:45.680 } 00:22:45.680 ] 00:22:45.680 }' 
00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=779 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.680 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:45.941 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:45.941 "name": "raid_bdev1", 00:22:45.941 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:45.941 "strip_size_kb": 0, 00:22:45.941 "state": "online", 00:22:45.941 "raid_level": "raid1", 00:22:45.941 "superblock": false, 00:22:45.941 "num_base_bdevs": 4, 00:22:45.941 "num_base_bdevs_discovered": 3, 00:22:45.941 "num_base_bdevs_operational": 3, 00:22:45.941 "process": { 00:22:45.941 "type": "rebuild", 00:22:45.941 "target": "spare", 00:22:45.941 "progress": { 00:22:45.941 "blocks": 26624, 00:22:45.941 "percent": 40 00:22:45.941 } 00:22:45.941 }, 00:22:45.941 "base_bdevs_list": [ 00:22:45.941 { 00:22:45.941 "name": "spare", 00:22:45.941 "uuid": "84a058dc-30d7-566c-989c-07f52793775f", 00:22:45.941 "is_configured": true, 00:22:45.941 "data_offset": 0, 00:22:45.941 "data_size": 65536 00:22:45.941 }, 00:22:45.941 { 00:22:45.941 "name": null, 00:22:45.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.941 "is_configured": false, 00:22:45.941 "data_offset": 0, 00:22:45.941 "data_size": 65536 00:22:45.941 }, 00:22:45.941 { 00:22:45.941 "name": "BaseBdev3", 00:22:45.941 "uuid": "29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:45.941 "is_configured": true, 00:22:45.941 "data_offset": 0, 00:22:45.941 "data_size": 65536 00:22:45.941 }, 00:22:45.941 { 00:22:45.941 "name": "BaseBdev4", 00:22:45.941 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:45.941 "is_configured": true, 00:22:45.941 "data_offset": 0, 00:22:45.941 "data_size": 65536 00:22:45.941 } 00:22:45.941 ] 00:22:45.941 }' 00:22:45.941 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:45.941 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:45.941 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:45.941 [2024-06-10 
10:18:07.688805] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:22:45.941 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:45.941 10:18:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:46.884 [2024-06-10 10:18:08.481286] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:22:46.884 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:46.884 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:46.884 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:46.884 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:46.884 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:46.884 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:46.884 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.884 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:47.145 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:47.145 "name": "raid_bdev1", 00:22:47.145 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:47.145 "strip_size_kb": 0, 00:22:47.145 "state": "online", 00:22:47.145 "raid_level": "raid1", 00:22:47.145 "superblock": false, 00:22:47.145 "num_base_bdevs": 4, 00:22:47.145 "num_base_bdevs_discovered": 3, 00:22:47.145 "num_base_bdevs_operational": 3, 00:22:47.145 "process": { 00:22:47.145 "type": "rebuild", 00:22:47.145 "target": "spare", 00:22:47.145 "progress": { 00:22:47.145 "blocks": 45056, 00:22:47.145 "percent": 68 00:22:47.145 } 00:22:47.145 }, 00:22:47.145 "base_bdevs_list": [ 00:22:47.145 { 00:22:47.145 "name": "spare", 00:22:47.145 "uuid": "84a058dc-30d7-566c-989c-07f52793775f", 00:22:47.145 "is_configured": true, 00:22:47.145 "data_offset": 0, 00:22:47.145 "data_size": 65536 00:22:47.145 }, 00:22:47.145 { 00:22:47.145 "name": null, 00:22:47.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:47.145 "is_configured": false, 00:22:47.145 "data_offset": 0, 00:22:47.145 "data_size": 65536 00:22:47.145 }, 00:22:47.145 { 00:22:47.145 "name": "BaseBdev3", 00:22:47.145 "uuid": "29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:47.145 "is_configured": true, 00:22:47.145 "data_offset": 0, 00:22:47.145 "data_size": 65536 00:22:47.145 }, 00:22:47.145 { 00:22:47.145 "name": "BaseBdev4", 00:22:47.145 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:47.145 "is_configured": true, 00:22:47.145 "data_offset": 0, 00:22:47.145 "data_size": 65536 00:22:47.145 } 00:22:47.145 ] 00:22:47.145 }' 00:22:47.145 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:47.145 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:47.145 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:47.145 [2024-06-10 10:18:08.927998] bdev_raid.c: 839:raid_bdev_submit_rw_request: 
*DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:22:47.145 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:47.145 10:18:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:47.716 [2024-06-10 10:18:09.384191] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:22:48.284 10:18:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:48.284 10:18:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:48.284 10:18:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:48.284 10:18:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:48.284 10:18:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:48.284 10:18:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:48.284 10:18:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.284 10:18:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.284 [2024-06-10 10:18:10.041216] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:48.284 [2024-06-10 10:18:10.147923] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:48.284 [2024-06-10 10:18:10.150288] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:48.543 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:48.543 "name": "raid_bdev1", 00:22:48.543 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:48.543 "strip_size_kb": 0, 00:22:48.543 "state": "online", 00:22:48.543 "raid_level": "raid1", 00:22:48.543 "superblock": false, 00:22:48.543 "num_base_bdevs": 4, 00:22:48.543 "num_base_bdevs_discovered": 3, 00:22:48.543 "num_base_bdevs_operational": 3, 00:22:48.543 "base_bdevs_list": [ 00:22:48.543 { 00:22:48.543 "name": "spare", 00:22:48.543 "uuid": "84a058dc-30d7-566c-989c-07f52793775f", 00:22:48.543 "is_configured": true, 00:22:48.543 "data_offset": 0, 00:22:48.543 "data_size": 65536 00:22:48.543 }, 00:22:48.543 { 00:22:48.543 "name": null, 00:22:48.543 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:48.543 "is_configured": false, 00:22:48.543 "data_offset": 0, 00:22:48.543 "data_size": 65536 00:22:48.543 }, 00:22:48.543 { 00:22:48.543 "name": "BaseBdev3", 00:22:48.543 "uuid": "29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:48.543 "is_configured": true, 00:22:48.543 "data_offset": 0, 00:22:48.543 "data_size": 65536 00:22:48.543 }, 00:22:48.543 { 00:22:48.543 "name": "BaseBdev4", 00:22:48.543 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:48.543 "is_configured": true, 00:22:48.543 "data_offset": 0, 00:22:48.543 "data_size": 65536 00:22:48.543 } 00:22:48.543 ] 00:22:48.543 }' 00:22:48.543 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:48.543 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:48.543 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:22:48.543 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:48.543 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:22:48.543 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:48.543 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:48.543 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:48.543 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:48.543 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:48.544 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.544 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:48.803 "name": "raid_bdev1", 00:22:48.803 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:48.803 "strip_size_kb": 0, 00:22:48.803 "state": "online", 00:22:48.803 "raid_level": "raid1", 00:22:48.803 "superblock": false, 00:22:48.803 "num_base_bdevs": 4, 00:22:48.803 "num_base_bdevs_discovered": 3, 00:22:48.803 "num_base_bdevs_operational": 3, 00:22:48.803 "base_bdevs_list": [ 00:22:48.803 { 00:22:48.803 "name": "spare", 00:22:48.803 "uuid": "84a058dc-30d7-566c-989c-07f52793775f", 00:22:48.803 "is_configured": true, 00:22:48.803 "data_offset": 0, 00:22:48.803 "data_size": 65536 00:22:48.803 }, 00:22:48.803 { 00:22:48.803 "name": null, 00:22:48.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:48.803 "is_configured": false, 00:22:48.803 "data_offset": 0, 00:22:48.803 "data_size": 65536 00:22:48.803 }, 00:22:48.803 { 00:22:48.803 "name": "BaseBdev3", 00:22:48.803 "uuid": "29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:48.803 "is_configured": true, 00:22:48.803 "data_offset": 0, 00:22:48.803 "data_size": 65536 00:22:48.803 }, 00:22:48.803 { 00:22:48.803 "name": "BaseBdev4", 00:22:48.803 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:48.803 "is_configured": true, 00:22:48.803 "data_offset": 0, 00:22:48.803 "data_size": 65536 00:22:48.803 } 00:22:48.803 ] 00:22:48.803 }' 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:48.803 10:18:10 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.803 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.063 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:49.063 "name": "raid_bdev1", 00:22:49.063 "uuid": "326666b7-72cd-42c5-a35d-a6a0395b47cd", 00:22:49.063 "strip_size_kb": 0, 00:22:49.063 "state": "online", 00:22:49.063 "raid_level": "raid1", 00:22:49.063 "superblock": false, 00:22:49.063 "num_base_bdevs": 4, 00:22:49.063 "num_base_bdevs_discovered": 3, 00:22:49.063 "num_base_bdevs_operational": 3, 00:22:49.063 "base_bdevs_list": [ 00:22:49.063 { 00:22:49.063 "name": "spare", 00:22:49.063 "uuid": "84a058dc-30d7-566c-989c-07f52793775f", 00:22:49.063 "is_configured": true, 00:22:49.063 "data_offset": 0, 00:22:49.063 "data_size": 65536 00:22:49.063 }, 00:22:49.063 { 00:22:49.063 "name": null, 00:22:49.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.063 "is_configured": false, 00:22:49.063 "data_offset": 0, 00:22:49.063 "data_size": 65536 00:22:49.063 }, 00:22:49.063 { 00:22:49.063 "name": "BaseBdev3", 00:22:49.063 "uuid": "29e59e20-79e9-5498-a0cd-fbec5d236094", 00:22:49.063 "is_configured": true, 00:22:49.063 "data_offset": 0, 00:22:49.063 "data_size": 65536 00:22:49.063 }, 00:22:49.063 { 00:22:49.063 "name": "BaseBdev4", 00:22:49.063 "uuid": "1ae52771-194a-513f-858f-f613a27a39bd", 00:22:49.063 "is_configured": true, 00:22:49.063 "data_offset": 0, 00:22:49.063 "data_size": 65536 00:22:49.063 } 00:22:49.063 ] 00:22:49.063 }' 00:22:49.063 10:18:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:49.064 10:18:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:49.633 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:49.633 [2024-06-10 10:18:11.456418] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:49.633 [2024-06-10 10:18:11.456439] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:49.893 00:22:49.893 Latency(us) 00:22:49.893 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:49.893 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:22:49.893 raid_bdev1 : 10.07 104.47 313.42 0.00 0.00 12820.60 248.91 109697.18 00:22:49.893 =================================================================================================================== 00:22:49.893 Total : 104.47 313.42 0.00 0.00 12820.60 248.91 109697.18 00:22:49.893 [2024-06-10 10:18:11.559933] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:49.893 [2024-06-10 
10:18:11.559958] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:49.893 [2024-06-10 10:18:11.560031] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:49.893 [2024-06-10 10:18:11.560037] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16dfe20 name raid_bdev1, state offline 00:22:49.893 0 00:22:49.893 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.893 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:22:50.153 /dev/nbd0 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:50.153 1+0 records in 00:22:50.153 1+0 records out 00:22:50.153 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000309802 s, 13.2 
MB/s 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:50.153 10:18:11 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:22:50.413 /dev/nbd1 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:50.413 1+0 records in 00:22:50.413 1+0 records out 00:22:50.413 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250431 s, 16.4 MB/s 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:50.413 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # 
bdev_list=('BaseBdev4') 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:50.672 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:22:50.932 /dev/nbd1 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:50.932 1+0 records in 00:22:50.932 1+0 records out 00:22:50.932 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026207 s, 15.6 MB/s 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:50.932 10:18:12 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:50.932 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:51.192 10:18:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1096621 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@949 -- # '[' -z 1096621 ']' 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # kill -0 1096621 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # uname 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # '[' Linux = 
Linux ']' 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1096621 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1096621' 00:22:51.451 killing process with pid 1096621 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # kill 1096621 00:22:51.451 Received shutdown signal, test time was about 11.697250 seconds 00:22:51.451 00:22:51.451 Latency(us) 00:22:51.451 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:51.451 =================================================================================================================== 00:22:51.451 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:22:51.451 [2024-06-10 10:18:13.188062] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:51.451 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@973 -- # wait 1096621 00:22:51.451 [2024-06-10 10:18:13.210727] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:22:51.711 00:22:51.711 real 0m16.185s 00:22:51.711 user 0m25.220s 00:22:51.711 sys 0m2.216s 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:51.711 ************************************ 00:22:51.711 END TEST raid_rebuild_test_io 00:22:51.711 ************************************ 00:22:51.711 10:18:13 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:22:51.711 10:18:13 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:22:51.711 10:18:13 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:51.711 10:18:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:51.711 ************************************ 00:22:51.711 START TEST raid_rebuild_test_sb_io 00:22:51.711 ************************************ 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 true true true 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1100189 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1100189 /var/tmp/spdk-raid.sock 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@830 -- # '[' -z 1100189 ']' 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:51.711 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:51.711 10:18:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:51.711 [2024-06-10 10:18:13.486537] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:22:51.711 [2024-06-10 10:18:13.486592] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1100189 ] 00:22:51.711 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:51.711 Zero copy mechanism will not be used. 00:22:51.711 [2024-06-10 10:18:13.576590] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:51.971 [2024-06-10 10:18:13.643499] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:51.971 [2024-06-10 10:18:13.683788] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:51.971 [2024-06-10 10:18:13.683811] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:52.540 10:18:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:52.540 10:18:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@863 -- # return 0 00:22:52.540 10:18:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:52.540 10:18:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:52.799 BaseBdev1_malloc 00:22:52.799 10:18:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:53.059 [2024-06-10 10:18:14.685751] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:53.059 [2024-06-10 10:18:14.685787] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:53.059 [2024-06-10 10:18:14.685801] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24bea50 00:22:53.059 [2024-06-10 10:18:14.685808] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:53.059 [2024-06-10 10:18:14.687123] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:53.059 [2024-06-10 10:18:14.687144] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:53.059 BaseBdev1 00:22:53.059 10:18:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:53.059 10:18:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:53.059 BaseBdev2_malloc 00:22:53.059 10:18:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:53.318 [2024-06-10 10:18:15.052735] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:53.318 [2024-06-10 10:18:15.052762] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:22:53.318 [2024-06-10 10:18:15.052775] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24bf5a0 00:22:53.318 [2024-06-10 10:18:15.052781] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:53.318 [2024-06-10 10:18:15.053958] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:53.318 [2024-06-10 10:18:15.053976] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:53.318 BaseBdev2 00:22:53.319 10:18:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:53.319 10:18:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:53.578 BaseBdev3_malloc 00:22:53.578 10:18:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:53.578 [2024-06-10 10:18:15.431517] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:53.578 [2024-06-10 10:18:15.431546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:53.578 [2024-06-10 10:18:15.431557] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x266ba30 00:22:53.578 [2024-06-10 10:18:15.431563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:53.578 [2024-06-10 10:18:15.432744] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:53.578 [2024-06-10 10:18:15.432762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:53.578 BaseBdev3 00:22:53.837 10:18:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:53.837 10:18:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:53.837 BaseBdev4_malloc 00:22:53.838 10:18:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:54.098 [2024-06-10 10:18:15.802341] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:54.098 [2024-06-10 10:18:15.802367] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:54.098 [2024-06-10 10:18:15.802378] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x266a2c0 00:22:54.098 [2024-06-10 10:18:15.802384] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:54.098 [2024-06-10 10:18:15.803548] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:54.098 [2024-06-10 10:18:15.803571] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:54.098 BaseBdev4 00:22:54.098 10:18:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:54.358 spare_malloc 00:22:54.358 10:18:16 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:54.358 spare_delay 00:22:54.358 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:54.617 [2024-06-10 10:18:16.345605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:54.617 [2024-06-10 10:18:16.345634] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:54.617 [2024-06-10 10:18:16.345646] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26702d0 00:22:54.617 [2024-06-10 10:18:16.345653] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:54.617 [2024-06-10 10:18:16.346842] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:54.617 [2024-06-10 10:18:16.346861] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:54.617 spare 00:22:54.617 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:54.877 [2024-06-10 10:18:16.522075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:54.877 [2024-06-10 10:18:16.523082] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:54.877 [2024-06-10 10:18:16.523124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:54.877 [2024-06-10 10:18:16.523158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:54.877 [2024-06-10 10:18:16.523301] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25efe20 00:22:54.877 [2024-06-10 10:18:16.523308] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:54.877 [2024-06-10 10:18:16.523455] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25efdc0 00:22:54.877 [2024-06-10 10:18:16.523567] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25efe20 00:22:54.877 [2024-06-10 10:18:16.523572] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25efe20 00:22:54.877 [2024-06-10 10:18:16.523638] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:54.877 10:18:16 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:54.877 "name": "raid_bdev1", 00:22:54.877 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:22:54.877 "strip_size_kb": 0, 00:22:54.877 "state": "online", 00:22:54.877 "raid_level": "raid1", 00:22:54.877 "superblock": true, 00:22:54.877 "num_base_bdevs": 4, 00:22:54.877 "num_base_bdevs_discovered": 4, 00:22:54.877 "num_base_bdevs_operational": 4, 00:22:54.877 "base_bdevs_list": [ 00:22:54.877 { 00:22:54.877 "name": "BaseBdev1", 00:22:54.877 "uuid": "95540a33-34e4-51f0-aa7d-aff06adcdbe9", 00:22:54.877 "is_configured": true, 00:22:54.877 "data_offset": 2048, 00:22:54.877 "data_size": 63488 00:22:54.877 }, 00:22:54.877 { 00:22:54.877 "name": "BaseBdev2", 00:22:54.877 "uuid": "395938c1-fa5f-5f9f-a211-4a9b0fd9c147", 00:22:54.877 "is_configured": true, 00:22:54.877 "data_offset": 2048, 00:22:54.877 "data_size": 63488 00:22:54.877 }, 00:22:54.877 { 00:22:54.877 "name": "BaseBdev3", 00:22:54.877 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:22:54.877 "is_configured": true, 00:22:54.877 "data_offset": 2048, 00:22:54.877 "data_size": 63488 00:22:54.877 }, 00:22:54.877 { 00:22:54.877 "name": "BaseBdev4", 00:22:54.877 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:22:54.877 "is_configured": true, 00:22:54.877 "data_offset": 2048, 00:22:54.877 "data_size": 63488 00:22:54.877 } 00:22:54.877 ] 00:22:54.877 }' 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:54.877 10:18:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:55.447 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:55.447 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:55.707 [2024-06-10 10:18:17.460629] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:55.707 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:22:55.707 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.707 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:55.967 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:22:55.967 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:22:55.967 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 
00:22:55.967 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:55.967 [2024-06-10 10:18:17.750518] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24bd900 00:22:55.967 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:55.967 Zero copy mechanism will not be used. 00:22:55.967 Running I/O for 60 seconds... 00:22:56.227 [2024-06-10 10:18:17.851877] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:56.227 [2024-06-10 10:18:17.858330] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x24bd900 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.227 10:18:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.227 10:18:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.227 "name": "raid_bdev1", 00:22:56.227 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:22:56.227 "strip_size_kb": 0, 00:22:56.227 "state": "online", 00:22:56.227 "raid_level": "raid1", 00:22:56.227 "superblock": true, 00:22:56.227 "num_base_bdevs": 4, 00:22:56.227 "num_base_bdevs_discovered": 3, 00:22:56.227 "num_base_bdevs_operational": 3, 00:22:56.227 "base_bdevs_list": [ 00:22:56.227 { 00:22:56.227 "name": null, 00:22:56.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:56.227 "is_configured": false, 00:22:56.227 "data_offset": 2048, 00:22:56.227 "data_size": 63488 00:22:56.227 }, 00:22:56.227 { 00:22:56.227 "name": "BaseBdev2", 00:22:56.227 "uuid": "395938c1-fa5f-5f9f-a211-4a9b0fd9c147", 00:22:56.227 "is_configured": true, 00:22:56.227 "data_offset": 2048, 00:22:56.227 "data_size": 63488 00:22:56.227 }, 00:22:56.227 { 00:22:56.227 "name": "BaseBdev3", 00:22:56.227 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:22:56.227 "is_configured": true, 00:22:56.227 "data_offset": 2048, 00:22:56.227 "data_size": 63488 00:22:56.227 }, 00:22:56.227 { 00:22:56.227 "name": "BaseBdev4", 00:22:56.227 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:22:56.227 "is_configured": true, 00:22:56.227 "data_offset": 
2048, 00:22:56.227 "data_size": 63488 00:22:56.227 } 00:22:56.227 ] 00:22:56.227 }' 00:22:56.227 10:18:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.227 10:18:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:56.797 10:18:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:57.057 [2024-06-10 10:18:18.817258] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:57.057 [2024-06-10 10:18:18.881123] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2555cd0 00:22:57.057 10:18:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:57.057 [2024-06-10 10:18:18.882934] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:57.317 [2024-06-10 10:18:19.006989] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:57.317 [2024-06-10 10:18:19.007730] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:57.577 [2024-06-10 10:18:19.232007] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:57.577 [2024-06-10 10:18:19.232291] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:57.850 [2024-06-10 10:18:19.574383] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:58.171 [2024-06-10 10:18:19.726971] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:58.171 [2024-06-10 10:18:19.727397] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:58.171 10:18:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:58.171 10:18:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:58.171 10:18:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:58.171 10:18:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:58.171 10:18:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:58.171 10:18:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.171 10:18:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.432 [2024-06-10 10:18:20.044889] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:58.432 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:58.432 "name": "raid_bdev1", 00:22:58.432 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:22:58.432 "strip_size_kb": 0, 00:22:58.432 "state": "online", 00:22:58.432 "raid_level": "raid1", 00:22:58.432 "superblock": true, 00:22:58.432 "num_base_bdevs": 4, 00:22:58.432 
"num_base_bdevs_discovered": 4, 00:22:58.432 "num_base_bdevs_operational": 4, 00:22:58.432 "process": { 00:22:58.432 "type": "rebuild", 00:22:58.432 "target": "spare", 00:22:58.432 "progress": { 00:22:58.432 "blocks": 14336, 00:22:58.432 "percent": 22 00:22:58.432 } 00:22:58.432 }, 00:22:58.432 "base_bdevs_list": [ 00:22:58.432 { 00:22:58.432 "name": "spare", 00:22:58.432 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:22:58.432 "is_configured": true, 00:22:58.432 "data_offset": 2048, 00:22:58.432 "data_size": 63488 00:22:58.432 }, 00:22:58.432 { 00:22:58.432 "name": "BaseBdev2", 00:22:58.432 "uuid": "395938c1-fa5f-5f9f-a211-4a9b0fd9c147", 00:22:58.432 "is_configured": true, 00:22:58.432 "data_offset": 2048, 00:22:58.432 "data_size": 63488 00:22:58.432 }, 00:22:58.432 { 00:22:58.432 "name": "BaseBdev3", 00:22:58.432 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:22:58.432 "is_configured": true, 00:22:58.432 "data_offset": 2048, 00:22:58.432 "data_size": 63488 00:22:58.432 }, 00:22:58.432 { 00:22:58.432 "name": "BaseBdev4", 00:22:58.432 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:22:58.432 "is_configured": true, 00:22:58.432 "data_offset": 2048, 00:22:58.432 "data_size": 63488 00:22:58.432 } 00:22:58.432 ] 00:22:58.432 }' 00:22:58.432 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:58.432 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:58.432 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:58.432 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:58.432 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:58.432 [2024-06-10 10:18:20.278641] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:58.693 [2024-06-10 10:18:20.350004] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:58.693 [2024-06-10 10:18:20.511000] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:58.693 [2024-06-10 10:18:20.526665] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:58.693 [2024-06-10 10:18:20.526686] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:58.693 [2024-06-10 10:18:20.526692] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:58.693 [2024-06-10 10:18:20.550144] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x24bd900 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.954 "name": "raid_bdev1", 00:22:58.954 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:22:58.954 "strip_size_kb": 0, 00:22:58.954 "state": "online", 00:22:58.954 "raid_level": "raid1", 00:22:58.954 "superblock": true, 00:22:58.954 "num_base_bdevs": 4, 00:22:58.954 "num_base_bdevs_discovered": 3, 00:22:58.954 "num_base_bdevs_operational": 3, 00:22:58.954 "base_bdevs_list": [ 00:22:58.954 { 00:22:58.954 "name": null, 00:22:58.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:58.954 "is_configured": false, 00:22:58.954 "data_offset": 2048, 00:22:58.954 "data_size": 63488 00:22:58.954 }, 00:22:58.954 { 00:22:58.954 "name": "BaseBdev2", 00:22:58.954 "uuid": "395938c1-fa5f-5f9f-a211-4a9b0fd9c147", 00:22:58.954 "is_configured": true, 00:22:58.954 "data_offset": 2048, 00:22:58.954 "data_size": 63488 00:22:58.954 }, 00:22:58.954 { 00:22:58.954 "name": "BaseBdev3", 00:22:58.954 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:22:58.954 "is_configured": true, 00:22:58.954 "data_offset": 2048, 00:22:58.954 "data_size": 63488 00:22:58.954 }, 00:22:58.954 { 00:22:58.954 "name": "BaseBdev4", 00:22:58.954 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:22:58.954 "is_configured": true, 00:22:58.954 "data_offset": 2048, 00:22:58.954 "data_size": 63488 00:22:58.954 } 00:22:58.954 ] 00:22:58.954 }' 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.954 10:18:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:59.527 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:59.527 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:59.527 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:59.527 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:59.527 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:59.527 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.527 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.788 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:59.788 "name": "raid_bdev1", 00:22:59.788 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:22:59.788 "strip_size_kb": 0, 
00:22:59.788 "state": "online", 00:22:59.788 "raid_level": "raid1", 00:22:59.788 "superblock": true, 00:22:59.788 "num_base_bdevs": 4, 00:22:59.788 "num_base_bdevs_discovered": 3, 00:22:59.788 "num_base_bdevs_operational": 3, 00:22:59.788 "base_bdevs_list": [ 00:22:59.788 { 00:22:59.788 "name": null, 00:22:59.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.788 "is_configured": false, 00:22:59.788 "data_offset": 2048, 00:22:59.788 "data_size": 63488 00:22:59.788 }, 00:22:59.788 { 00:22:59.788 "name": "BaseBdev2", 00:22:59.788 "uuid": "395938c1-fa5f-5f9f-a211-4a9b0fd9c147", 00:22:59.788 "is_configured": true, 00:22:59.788 "data_offset": 2048, 00:22:59.788 "data_size": 63488 00:22:59.788 }, 00:22:59.788 { 00:22:59.788 "name": "BaseBdev3", 00:22:59.788 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:22:59.788 "is_configured": true, 00:22:59.788 "data_offset": 2048, 00:22:59.788 "data_size": 63488 00:22:59.788 }, 00:22:59.788 { 00:22:59.788 "name": "BaseBdev4", 00:22:59.788 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:22:59.788 "is_configured": true, 00:22:59.788 "data_offset": 2048, 00:22:59.788 "data_size": 63488 00:22:59.788 } 00:22:59.788 ] 00:22:59.788 }' 00:22:59.788 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:59.788 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:59.788 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:59.788 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:59.788 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:00.049 [2024-06-10 10:18:21.816932] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:00.049 10:18:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:00.049 [2024-06-10 10:18:21.873703] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24bdb60 00:23:00.049 [2024-06-10 10:18:21.874880] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:00.310 [2024-06-10 10:18:21.976841] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:00.310 [2024-06-10 10:18:21.977617] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:00.571 [2024-06-10 10:18:22.203649] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:00.571 [2024-06-10 10:18:22.203780] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:00.571 [2024-06-10 10:18:22.422449] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:00.831 [2024-06-10 10:18:22.552232] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:01.092 [2024-06-10 10:18:22.778961] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:01.092 [2024-06-10 10:18:22.779183] bdev_raid.c: 839:raid_bdev_submit_rw_request: 
*DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:01.092 10:18:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:01.092 10:18:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:01.092 10:18:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:01.092 10:18:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:01.092 10:18:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:01.092 10:18:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.092 10:18:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.352 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:01.352 "name": "raid_bdev1", 00:23:01.352 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:01.352 "strip_size_kb": 0, 00:23:01.352 "state": "online", 00:23:01.352 "raid_level": "raid1", 00:23:01.352 "superblock": true, 00:23:01.352 "num_base_bdevs": 4, 00:23:01.352 "num_base_bdevs_discovered": 4, 00:23:01.352 "num_base_bdevs_operational": 4, 00:23:01.352 "process": { 00:23:01.352 "type": "rebuild", 00:23:01.352 "target": "spare", 00:23:01.352 "progress": { 00:23:01.352 "blocks": 16384, 00:23:01.352 "percent": 25 00:23:01.352 } 00:23:01.352 }, 00:23:01.352 "base_bdevs_list": [ 00:23:01.352 { 00:23:01.352 "name": "spare", 00:23:01.352 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:23:01.352 "is_configured": true, 00:23:01.352 "data_offset": 2048, 00:23:01.352 "data_size": 63488 00:23:01.352 }, 00:23:01.352 { 00:23:01.352 "name": "BaseBdev2", 00:23:01.352 "uuid": "395938c1-fa5f-5f9f-a211-4a9b0fd9c147", 00:23:01.353 "is_configured": true, 00:23:01.353 "data_offset": 2048, 00:23:01.353 "data_size": 63488 00:23:01.353 }, 00:23:01.353 { 00:23:01.353 "name": "BaseBdev3", 00:23:01.353 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:01.353 "is_configured": true, 00:23:01.353 "data_offset": 2048, 00:23:01.353 "data_size": 63488 00:23:01.353 }, 00:23:01.353 { 00:23:01.353 "name": "BaseBdev4", 00:23:01.353 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:01.353 "is_configured": true, 00:23:01.353 "data_offset": 2048, 00:23:01.353 "data_size": 63488 00:23:01.353 } 00:23:01.353 ] 00:23:01.353 }' 00:23:01.353 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:01.353 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:01.353 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:01.353 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:01.353 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:01.353 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:01.353 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:01.353 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 
00:23:01.353 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:01.353 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:23:01.353 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:01.613 [2024-06-10 10:18:23.333197] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:01.873 [2024-06-10 10:18:23.592306] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x24bd900 00:23:01.873 [2024-06-10 10:18:23.592326] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x24bdb60 00:23:01.873 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:23:01.873 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:23:01.873 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:01.873 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:01.873 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:01.873 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:01.873 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:01.873 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.873 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:02.134 "name": "raid_bdev1", 00:23:02.134 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:02.134 "strip_size_kb": 0, 00:23:02.134 "state": "online", 00:23:02.134 "raid_level": "raid1", 00:23:02.134 "superblock": true, 00:23:02.134 "num_base_bdevs": 4, 00:23:02.134 "num_base_bdevs_discovered": 3, 00:23:02.134 "num_base_bdevs_operational": 3, 00:23:02.134 "process": { 00:23:02.134 "type": "rebuild", 00:23:02.134 "target": "spare", 00:23:02.134 "progress": { 00:23:02.134 "blocks": 26624, 00:23:02.134 "percent": 41 00:23:02.134 } 00:23:02.134 }, 00:23:02.134 "base_bdevs_list": [ 00:23:02.134 { 00:23:02.134 "name": "spare", 00:23:02.134 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:23:02.134 "is_configured": true, 00:23:02.134 "data_offset": 2048, 00:23:02.134 "data_size": 63488 00:23:02.134 }, 00:23:02.134 { 00:23:02.134 "name": null, 00:23:02.134 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.134 "is_configured": false, 00:23:02.134 "data_offset": 2048, 00:23:02.134 "data_size": 63488 00:23:02.134 }, 00:23:02.134 { 00:23:02.134 "name": "BaseBdev3", 00:23:02.134 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:02.134 "is_configured": true, 00:23:02.134 "data_offset": 2048, 00:23:02.134 "data_size": 63488 00:23:02.134 }, 00:23:02.134 { 00:23:02.134 "name": "BaseBdev4", 00:23:02.134 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:02.134 "is_configured": true, 00:23:02.134 "data_offset": 2048, 00:23:02.134 "data_size": 63488 00:23:02.134 } 00:23:02.134 ] 00:23:02.134 }' 
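For reference, the rebuild-progress checks traced here reduce to one RPC plus a couple of jq filters; a standalone sketch using the rpc.py path, socket, and bdev name shown in the trace (the echo wrapper is illustrative):
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Pull the record for raid_bdev1 and read the rebuild process fields, defaulting to "none".
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
ptype=$(echo "$info" | jq -r '.process.type // "none"')
ptarget=$(echo "$info" | jq -r '.process.target // "none"')
pblocks=$(echo "$info" | jq -r '.process.progress.blocks // 0')
echo "process=$ptype target=$ptarget blocks=$pblocks"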
00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=795 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.134 10:18:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.395 [2024-06-10 10:18:24.070086] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:23:02.395 10:18:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:02.395 "name": "raid_bdev1", 00:23:02.395 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:02.395 "strip_size_kb": 0, 00:23:02.395 "state": "online", 00:23:02.395 "raid_level": "raid1", 00:23:02.395 "superblock": true, 00:23:02.395 "num_base_bdevs": 4, 00:23:02.395 "num_base_bdevs_discovered": 3, 00:23:02.395 "num_base_bdevs_operational": 3, 00:23:02.395 "process": { 00:23:02.395 "type": "rebuild", 00:23:02.395 "target": "spare", 00:23:02.395 "progress": { 00:23:02.395 "blocks": 32768, 00:23:02.395 "percent": 51 00:23:02.395 } 00:23:02.395 }, 00:23:02.395 "base_bdevs_list": [ 00:23:02.395 { 00:23:02.395 "name": "spare", 00:23:02.395 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:23:02.395 "is_configured": true, 00:23:02.395 "data_offset": 2048, 00:23:02.395 "data_size": 63488 00:23:02.395 }, 00:23:02.395 { 00:23:02.395 "name": null, 00:23:02.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.395 "is_configured": false, 00:23:02.395 "data_offset": 2048, 00:23:02.395 "data_size": 63488 00:23:02.395 }, 00:23:02.395 { 00:23:02.395 "name": "BaseBdev3", 00:23:02.395 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:02.395 "is_configured": true, 00:23:02.395 "data_offset": 2048, 00:23:02.395 "data_size": 63488 00:23:02.395 }, 00:23:02.395 { 00:23:02.395 "name": "BaseBdev4", 00:23:02.395 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:02.395 "is_configured": true, 00:23:02.395 "data_offset": 2048, 00:23:02.395 "data_size": 63488 00:23:02.395 } 00:23:02.395 ] 00:23:02.395 }' 00:23:02.395 10:18:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:02.395 10:18:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:02.395 10:18:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:02.395 10:18:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:02.395 10:18:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:02.655 [2024-06-10 10:18:24.300737] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:23:02.914 [2024-06-10 10:18:24.648242] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:03.173 [2024-06-10 10:18:24.878264] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:23:03.173 [2024-06-10 10:18:24.878796] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:23:03.433 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:03.433 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:03.433 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:03.433 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:03.433 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:03.433 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:03.433 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.433 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:03.693 [2024-06-10 10:18:25.332771] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:23:03.693 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:03.693 "name": "raid_bdev1", 00:23:03.693 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:03.693 "strip_size_kb": 0, 00:23:03.693 "state": "online", 00:23:03.693 "raid_level": "raid1", 00:23:03.693 "superblock": true, 00:23:03.693 "num_base_bdevs": 4, 00:23:03.693 "num_base_bdevs_discovered": 3, 00:23:03.693 "num_base_bdevs_operational": 3, 00:23:03.693 "process": { 00:23:03.693 "type": "rebuild", 00:23:03.693 "target": "spare", 00:23:03.693 "progress": { 00:23:03.693 "blocks": 51200, 00:23:03.693 "percent": 80 00:23:03.693 } 00:23:03.693 }, 00:23:03.693 "base_bdevs_list": [ 00:23:03.693 { 00:23:03.693 "name": "spare", 00:23:03.693 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:23:03.693 "is_configured": true, 00:23:03.693 "data_offset": 2048, 00:23:03.693 "data_size": 63488 00:23:03.693 }, 00:23:03.693 { 00:23:03.693 "name": null, 00:23:03.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:03.693 "is_configured": false, 00:23:03.693 "data_offset": 2048, 00:23:03.693 "data_size": 63488 00:23:03.693 }, 00:23:03.693 { 00:23:03.693 "name": "BaseBdev3", 00:23:03.693 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:03.693 "is_configured": true, 
00:23:03.693 "data_offset": 2048, 00:23:03.693 "data_size": 63488 00:23:03.693 }, 00:23:03.693 { 00:23:03.693 "name": "BaseBdev4", 00:23:03.693 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:03.693 "is_configured": true, 00:23:03.693 "data_offset": 2048, 00:23:03.693 "data_size": 63488 00:23:03.693 } 00:23:03.693 ] 00:23:03.693 }' 00:23:03.693 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:03.693 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:03.693 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:03.693 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:03.693 10:18:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:03.693 [2024-06-10 10:18:25.550302] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:23:04.262 [2024-06-10 10:18:26.123338] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:04.522 [2024-06-10 10:18:26.230048] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:04.522 [2024-06-10 10:18:26.232481] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:04.782 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:04.783 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:04.783 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:04.783 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:04.783 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:04.783 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:04.783 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.783 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:05.043 "name": "raid_bdev1", 00:23:05.043 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:05.043 "strip_size_kb": 0, 00:23:05.043 "state": "online", 00:23:05.043 "raid_level": "raid1", 00:23:05.043 "superblock": true, 00:23:05.043 "num_base_bdevs": 4, 00:23:05.043 "num_base_bdevs_discovered": 3, 00:23:05.043 "num_base_bdevs_operational": 3, 00:23:05.043 "base_bdevs_list": [ 00:23:05.043 { 00:23:05.043 "name": "spare", 00:23:05.043 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:23:05.043 "is_configured": true, 00:23:05.043 "data_offset": 2048, 00:23:05.043 "data_size": 63488 00:23:05.043 }, 00:23:05.043 { 00:23:05.043 "name": null, 00:23:05.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.043 "is_configured": false, 00:23:05.043 "data_offset": 2048, 00:23:05.043 "data_size": 63488 00:23:05.043 }, 00:23:05.043 { 00:23:05.043 "name": "BaseBdev3", 00:23:05.043 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:05.043 "is_configured": true, 
00:23:05.043 "data_offset": 2048, 00:23:05.043 "data_size": 63488 00:23:05.043 }, 00:23:05.043 { 00:23:05.043 "name": "BaseBdev4", 00:23:05.043 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:05.043 "is_configured": true, 00:23:05.043 "data_offset": 2048, 00:23:05.043 "data_size": 63488 00:23:05.043 } 00:23:05.043 ] 00:23:05.043 }' 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.043 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.303 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:05.303 "name": "raid_bdev1", 00:23:05.303 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:05.303 "strip_size_kb": 0, 00:23:05.303 "state": "online", 00:23:05.303 "raid_level": "raid1", 00:23:05.303 "superblock": true, 00:23:05.303 "num_base_bdevs": 4, 00:23:05.303 "num_base_bdevs_discovered": 3, 00:23:05.303 "num_base_bdevs_operational": 3, 00:23:05.303 "base_bdevs_list": [ 00:23:05.303 { 00:23:05.303 "name": "spare", 00:23:05.303 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:23:05.303 "is_configured": true, 00:23:05.303 "data_offset": 2048, 00:23:05.303 "data_size": 63488 00:23:05.303 }, 00:23:05.303 { 00:23:05.303 "name": null, 00:23:05.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.303 "is_configured": false, 00:23:05.303 "data_offset": 2048, 00:23:05.303 "data_size": 63488 00:23:05.303 }, 00:23:05.303 { 00:23:05.303 "name": "BaseBdev3", 00:23:05.303 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:05.303 "is_configured": true, 00:23:05.303 "data_offset": 2048, 00:23:05.303 "data_size": 63488 00:23:05.303 }, 00:23:05.303 { 00:23:05.303 "name": "BaseBdev4", 00:23:05.303 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:05.303 "is_configured": true, 00:23:05.303 "data_offset": 2048, 00:23:05.303 "data_size": 63488 00:23:05.303 } 00:23:05.303 ] 00:23:05.303 }' 00:23:05.303 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:05.303 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:05.303 10:18:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.303 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.562 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:05.562 "name": "raid_bdev1", 00:23:05.562 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:05.562 "strip_size_kb": 0, 00:23:05.562 "state": "online", 00:23:05.562 "raid_level": "raid1", 00:23:05.562 "superblock": true, 00:23:05.562 "num_base_bdevs": 4, 00:23:05.562 "num_base_bdevs_discovered": 3, 00:23:05.562 "num_base_bdevs_operational": 3, 00:23:05.562 "base_bdevs_list": [ 00:23:05.562 { 00:23:05.562 "name": "spare", 00:23:05.562 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:23:05.562 "is_configured": true, 00:23:05.562 "data_offset": 2048, 00:23:05.562 "data_size": 63488 00:23:05.562 }, 00:23:05.562 { 00:23:05.562 "name": null, 00:23:05.562 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.562 "is_configured": false, 00:23:05.562 "data_offset": 2048, 00:23:05.562 "data_size": 63488 00:23:05.562 }, 00:23:05.562 { 00:23:05.562 "name": "BaseBdev3", 00:23:05.562 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:05.562 "is_configured": true, 00:23:05.562 "data_offset": 2048, 00:23:05.562 "data_size": 63488 00:23:05.562 }, 00:23:05.562 { 00:23:05.562 "name": "BaseBdev4", 00:23:05.562 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:05.562 "is_configured": true, 00:23:05.562 "data_offset": 2048, 00:23:05.562 "data_size": 63488 00:23:05.562 } 00:23:05.562 ] 00:23:05.562 }' 00:23:05.562 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:05.562 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:06.133 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:06.133 [2024-06-10 10:18:27.942783] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:23:06.133 [2024-06-10 10:18:27.942803] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:06.133 00:23:06.133 Latency(us) 00:23:06.133 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:06.133 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:06.133 raid_bdev1 : 10.19 108.78 326.35 0.00 0.00 12445.83 252.06 109697.18 00:23:06.133 =================================================================================================================== 00:23:06.133 Total : 108.78 326.35 0.00 0.00 12445.83 252.06 109697.18 00:23:06.133 [2024-06-10 10:18:27.974093] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:06.133 [2024-06-10 10:18:27.974118] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:06.133 [2024-06-10 10:18:27.974190] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:06.133 [2024-06-10 10:18:27.974196] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25efe20 name raid_bdev1, state offline 00:23:06.133 0 00:23:06.133 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.133 10:18:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:06.394 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:06.655 /dev/nbd0 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:06.655 10:18:28 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:06.655 1+0 records in 00:23:06.655 1+0 records out 00:23:06.655 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283367 s, 14.5 MB/s 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:06.655 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:23:06.916 /dev/nbd1 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:06.916 1+0 records in 00:23:06.916 1+0 records out 00:23:06.916 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251788 s, 16.3 MB/s 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:06.916 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:06.917 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:06.917 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:06.917 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:06.917 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:07.178 10:18:28 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:07.178 10:18:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:23:07.439 /dev/nbd1 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:07.439 1+0 records in 00:23:07.439 1+0 records out 00:23:07.439 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000213646 s, 19.2 MB/s 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:07.439 10:18:29 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:07.439 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:07.700 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:07.962 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:07.962 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # 
waitfornbd_exit nbd0 00:23:07.962 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:07.962 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:07.962 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:07.962 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:07.962 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:07.962 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:07.962 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:07.962 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:07.962 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:08.222 [2024-06-10 10:18:29.933211] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:08.222 [2024-06-10 10:18:29.933244] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.222 [2024-06-10 10:18:29.933257] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24bdfe0 00:23:08.222 [2024-06-10 10:18:29.933264] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.222 [2024-06-10 10:18:29.934577] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.222 [2024-06-10 10:18:29.934598] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:08.222 [2024-06-10 10:18:29.934664] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:08.222 [2024-06-10 10:18:29.934683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:08.222 [2024-06-10 10:18:29.934761] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:08.222 [2024-06-10 10:18:29.934818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:08.223 spare 00:23:08.223 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:08.223 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:08.223 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:08.223 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.223 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.223 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:08.223 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.223 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.223 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.223 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.223 
10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.223 10:18:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.223 [2024-06-10 10:18:30.035120] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2669340 00:23:08.223 [2024-06-10 10:18:30.035133] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:08.223 [2024-06-10 10:18:30.035303] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x266cc00 00:23:08.223 [2024-06-10 10:18:30.035429] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2669340 00:23:08.223 [2024-06-10 10:18:30.035435] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2669340 00:23:08.223 [2024-06-10 10:18:30.035523] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:08.484 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:08.484 "name": "raid_bdev1", 00:23:08.484 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:08.484 "strip_size_kb": 0, 00:23:08.484 "state": "online", 00:23:08.484 "raid_level": "raid1", 00:23:08.484 "superblock": true, 00:23:08.484 "num_base_bdevs": 4, 00:23:08.484 "num_base_bdevs_discovered": 3, 00:23:08.484 "num_base_bdevs_operational": 3, 00:23:08.484 "base_bdevs_list": [ 00:23:08.484 { 00:23:08.484 "name": "spare", 00:23:08.484 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:23:08.484 "is_configured": true, 00:23:08.484 "data_offset": 2048, 00:23:08.484 "data_size": 63488 00:23:08.484 }, 00:23:08.484 { 00:23:08.484 "name": null, 00:23:08.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.484 "is_configured": false, 00:23:08.484 "data_offset": 2048, 00:23:08.484 "data_size": 63488 00:23:08.484 }, 00:23:08.484 { 00:23:08.484 "name": "BaseBdev3", 00:23:08.484 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:08.484 "is_configured": true, 00:23:08.484 "data_offset": 2048, 00:23:08.484 "data_size": 63488 00:23:08.484 }, 00:23:08.484 { 00:23:08.484 "name": "BaseBdev4", 00:23:08.484 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:08.484 "is_configured": true, 00:23:08.484 "data_offset": 2048, 00:23:08.484 "data_size": 63488 00:23:08.484 } 00:23:08.484 ] 00:23:08.484 }' 00:23:08.484 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:08.484 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:09.056 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:09.056 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:09.056 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:09.056 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:09.056 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:09.056 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.056 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- 
# jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.056 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:09.056 "name": "raid_bdev1", 00:23:09.056 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:09.056 "strip_size_kb": 0, 00:23:09.056 "state": "online", 00:23:09.056 "raid_level": "raid1", 00:23:09.056 "superblock": true, 00:23:09.056 "num_base_bdevs": 4, 00:23:09.056 "num_base_bdevs_discovered": 3, 00:23:09.056 "num_base_bdevs_operational": 3, 00:23:09.056 "base_bdevs_list": [ 00:23:09.056 { 00:23:09.056 "name": "spare", 00:23:09.056 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:23:09.056 "is_configured": true, 00:23:09.056 "data_offset": 2048, 00:23:09.056 "data_size": 63488 00:23:09.056 }, 00:23:09.056 { 00:23:09.056 "name": null, 00:23:09.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.056 "is_configured": false, 00:23:09.056 "data_offset": 2048, 00:23:09.056 "data_size": 63488 00:23:09.056 }, 00:23:09.056 { 00:23:09.056 "name": "BaseBdev3", 00:23:09.056 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:09.056 "is_configured": true, 00:23:09.056 "data_offset": 2048, 00:23:09.056 "data_size": 63488 00:23:09.056 }, 00:23:09.056 { 00:23:09.056 "name": "BaseBdev4", 00:23:09.056 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:09.056 "is_configured": true, 00:23:09.056 "data_offset": 2048, 00:23:09.056 "data_size": 63488 00:23:09.056 } 00:23:09.056 ] 00:23:09.056 }' 00:23:09.056 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:09.316 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:09.317 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:09.317 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:09.317 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.317 10:18:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:09.317 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:09.317 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:09.577 [2024-06-10 10:18:31.345036] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.578 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.838 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.838 "name": "raid_bdev1", 00:23:09.838 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:09.838 "strip_size_kb": 0, 00:23:09.838 "state": "online", 00:23:09.838 "raid_level": "raid1", 00:23:09.838 "superblock": true, 00:23:09.838 "num_base_bdevs": 4, 00:23:09.838 "num_base_bdevs_discovered": 2, 00:23:09.838 "num_base_bdevs_operational": 2, 00:23:09.838 "base_bdevs_list": [ 00:23:09.838 { 00:23:09.838 "name": null, 00:23:09.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.838 "is_configured": false, 00:23:09.838 "data_offset": 2048, 00:23:09.838 "data_size": 63488 00:23:09.838 }, 00:23:09.838 { 00:23:09.838 "name": null, 00:23:09.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.838 "is_configured": false, 00:23:09.838 "data_offset": 2048, 00:23:09.838 "data_size": 63488 00:23:09.838 }, 00:23:09.838 { 00:23:09.838 "name": "BaseBdev3", 00:23:09.838 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:09.838 "is_configured": true, 00:23:09.838 "data_offset": 2048, 00:23:09.838 "data_size": 63488 00:23:09.838 }, 00:23:09.838 { 00:23:09.838 "name": "BaseBdev4", 00:23:09.838 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:09.838 "is_configured": true, 00:23:09.838 "data_offset": 2048, 00:23:09.838 "data_size": 63488 00:23:09.838 } 00:23:09.838 ] 00:23:09.838 }' 00:23:09.838 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.838 10:18:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:10.408 10:18:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:10.408 [2024-06-10 10:18:32.259451] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:10.408 [2024-06-10 10:18:32.259569] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:23:10.408 [2024-06-10 10:18:32.259578] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
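The spare re-add traced between bdev_raid.sh@745 and @754 above comes down to two RPCs; a minimal sketch using the bdev names and socket from the trace (the explicit add mirrors the @754 call and may be redundant when examine already claims the bdev):
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Recreate the delayed passthru bdev "spare" on top of spare_delay; examine then finds
# the raid superblock and re-adds it to raid_bdev1, as in the NOTICE lines above.
"$rpc" -s "$sock" bdev_passthru_create -b spare_delay -p spare
"$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare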
00:23:10.408 [2024-06-10 10:18:32.259597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:10.408 [2024-06-10 10:18:32.262609] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x266f450 00:23:10.408 [2024-06-10 10:18:32.264223] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:10.668 10:18:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:11.608 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:11.608 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:11.608 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:11.608 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:11.608 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:11.608 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.608 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.869 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:11.869 "name": "raid_bdev1", 00:23:11.869 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:11.869 "strip_size_kb": 0, 00:23:11.869 "state": "online", 00:23:11.869 "raid_level": "raid1", 00:23:11.869 "superblock": true, 00:23:11.869 "num_base_bdevs": 4, 00:23:11.869 "num_base_bdevs_discovered": 3, 00:23:11.869 "num_base_bdevs_operational": 3, 00:23:11.869 "process": { 00:23:11.869 "type": "rebuild", 00:23:11.869 "target": "spare", 00:23:11.869 "progress": { 00:23:11.869 "blocks": 22528, 00:23:11.869 "percent": 35 00:23:11.869 } 00:23:11.869 }, 00:23:11.869 "base_bdevs_list": [ 00:23:11.869 { 00:23:11.869 "name": "spare", 00:23:11.869 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:23:11.869 "is_configured": true, 00:23:11.869 "data_offset": 2048, 00:23:11.869 "data_size": 63488 00:23:11.869 }, 00:23:11.869 { 00:23:11.869 "name": null, 00:23:11.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.869 "is_configured": false, 00:23:11.869 "data_offset": 2048, 00:23:11.869 "data_size": 63488 00:23:11.869 }, 00:23:11.869 { 00:23:11.869 "name": "BaseBdev3", 00:23:11.869 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:11.869 "is_configured": true, 00:23:11.869 "data_offset": 2048, 00:23:11.869 "data_size": 63488 00:23:11.869 }, 00:23:11.869 { 00:23:11.869 "name": "BaseBdev4", 00:23:11.869 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:11.869 "is_configured": true, 00:23:11.869 "data_offset": 2048, 00:23:11.869 "data_size": 63488 00:23:11.869 } 00:23:11.869 ] 00:23:11.869 }' 00:23:11.869 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:11.869 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:11.869 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:11.869 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:11.869 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:11.869 [2024-06-10 10:18:33.716954] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:12.130 [2024-06-10 10:18:33.773060] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:12.130 [2024-06-10 10:18:33.773092] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:12.130 [2024-06-10 10:18:33.773103] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:12.130 [2024-06-10 10:18:33.773107] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.130 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.391 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:12.391 "name": "raid_bdev1", 00:23:12.391 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:12.391 "strip_size_kb": 0, 00:23:12.391 "state": "online", 00:23:12.391 "raid_level": "raid1", 00:23:12.391 "superblock": true, 00:23:12.391 "num_base_bdevs": 4, 00:23:12.391 "num_base_bdevs_discovered": 2, 00:23:12.391 "num_base_bdevs_operational": 2, 00:23:12.391 "base_bdevs_list": [ 00:23:12.391 { 00:23:12.391 "name": null, 00:23:12.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.391 "is_configured": false, 00:23:12.391 "data_offset": 2048, 00:23:12.391 "data_size": 63488 00:23:12.391 }, 00:23:12.391 { 00:23:12.391 "name": null, 00:23:12.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.391 "is_configured": false, 00:23:12.391 "data_offset": 2048, 00:23:12.391 "data_size": 63488 00:23:12.391 }, 00:23:12.391 { 00:23:12.391 "name": "BaseBdev3", 00:23:12.391 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:12.391 "is_configured": true, 00:23:12.391 "data_offset": 2048, 00:23:12.391 "data_size": 63488 00:23:12.391 }, 00:23:12.391 { 00:23:12.391 "name": "BaseBdev4", 00:23:12.391 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:12.391 "is_configured": true, 
00:23:12.391 "data_offset": 2048, 00:23:12.391 "data_size": 63488 00:23:12.391 } 00:23:12.391 ] 00:23:12.391 }' 00:23:12.391 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:12.391 10:18:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:12.963 10:18:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:12.963 [2024-06-10 10:18:34.723619] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:12.963 [2024-06-10 10:18:34.723653] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:12.963 [2024-06-10 10:18:34.723669] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25f2a70 00:23:12.963 [2024-06-10 10:18:34.723676] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:12.963 [2024-06-10 10:18:34.723980] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:12.963 [2024-06-10 10:18:34.723991] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:12.963 [2024-06-10 10:18:34.724052] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:12.963 [2024-06-10 10:18:34.724058] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:23:12.963 [2024-06-10 10:18:34.724064] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:12.963 [2024-06-10 10:18:34.724075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:12.963 [2024-06-10 10:18:34.727018] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x266f450 00:23:12.963 [2024-06-10 10:18:34.728119] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:12.963 spare 00:23:12.963 10:18:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:13.902 10:18:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:13.902 10:18:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:13.902 10:18:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:13.902 10:18:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:13.902 10:18:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:13.902 10:18:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.902 10:18:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.163 10:18:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.163 "name": "raid_bdev1", 00:23:14.163 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:14.163 "strip_size_kb": 0, 00:23:14.163 "state": "online", 00:23:14.163 "raid_level": "raid1", 00:23:14.163 "superblock": true, 00:23:14.163 "num_base_bdevs": 4, 00:23:14.163 "num_base_bdevs_discovered": 3, 00:23:14.163 
"num_base_bdevs_operational": 3, 00:23:14.163 "process": { 00:23:14.163 "type": "rebuild", 00:23:14.163 "target": "spare", 00:23:14.163 "progress": { 00:23:14.163 "blocks": 22528, 00:23:14.163 "percent": 35 00:23:14.163 } 00:23:14.163 }, 00:23:14.163 "base_bdevs_list": [ 00:23:14.163 { 00:23:14.163 "name": "spare", 00:23:14.163 "uuid": "6148df33-709f-5d8f-b118-42334a663944", 00:23:14.163 "is_configured": true, 00:23:14.163 "data_offset": 2048, 00:23:14.163 "data_size": 63488 00:23:14.163 }, 00:23:14.163 { 00:23:14.163 "name": null, 00:23:14.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.163 "is_configured": false, 00:23:14.163 "data_offset": 2048, 00:23:14.163 "data_size": 63488 00:23:14.163 }, 00:23:14.163 { 00:23:14.163 "name": "BaseBdev3", 00:23:14.163 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:14.163 "is_configured": true, 00:23:14.163 "data_offset": 2048, 00:23:14.163 "data_size": 63488 00:23:14.163 }, 00:23:14.163 { 00:23:14.163 "name": "BaseBdev4", 00:23:14.163 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:14.163 "is_configured": true, 00:23:14.163 "data_offset": 2048, 00:23:14.163 "data_size": 63488 00:23:14.163 } 00:23:14.163 ] 00:23:14.163 }' 00:23:14.163 10:18:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.163 10:18:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:14.163 10:18:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:14.423 [2024-06-10 10:18:36.204966] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:14.423 [2024-06-10 10:18:36.237004] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:14.423 [2024-06-10 10:18:36.237033] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:14.423 [2024-06-10 10:18:36.237043] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:14.423 [2024-06-10 10:18:36.237047] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:14.423 
10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.423 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.683 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:14.683 "name": "raid_bdev1", 00:23:14.683 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:14.683 "strip_size_kb": 0, 00:23:14.683 "state": "online", 00:23:14.683 "raid_level": "raid1", 00:23:14.683 "superblock": true, 00:23:14.683 "num_base_bdevs": 4, 00:23:14.683 "num_base_bdevs_discovered": 2, 00:23:14.683 "num_base_bdevs_operational": 2, 00:23:14.683 "base_bdevs_list": [ 00:23:14.683 { 00:23:14.683 "name": null, 00:23:14.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.683 "is_configured": false, 00:23:14.683 "data_offset": 2048, 00:23:14.683 "data_size": 63488 00:23:14.683 }, 00:23:14.683 { 00:23:14.683 "name": null, 00:23:14.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.683 "is_configured": false, 00:23:14.683 "data_offset": 2048, 00:23:14.683 "data_size": 63488 00:23:14.683 }, 00:23:14.683 { 00:23:14.683 "name": "BaseBdev3", 00:23:14.683 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:14.683 "is_configured": true, 00:23:14.683 "data_offset": 2048, 00:23:14.683 "data_size": 63488 00:23:14.683 }, 00:23:14.683 { 00:23:14.683 "name": "BaseBdev4", 00:23:14.683 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:14.683 "is_configured": true, 00:23:14.683 "data_offset": 2048, 00:23:14.683 "data_size": 63488 00:23:14.683 } 00:23:14.683 ] 00:23:14.683 }' 00:23:14.683 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:14.683 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:15.253 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:15.253 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:15.253 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:15.253 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:15.253 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:15.253 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.253 10:18:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.513 10:18:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:15.513 "name": "raid_bdev1", 00:23:15.513 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:15.513 "strip_size_kb": 0, 00:23:15.513 "state": "online", 00:23:15.513 "raid_level": "raid1", 00:23:15.513 "superblock": true, 00:23:15.513 "num_base_bdevs": 4, 00:23:15.513 "num_base_bdevs_discovered": 2, 00:23:15.513 "num_base_bdevs_operational": 2, 00:23:15.513 "base_bdevs_list": [ 00:23:15.513 { 00:23:15.513 "name": null, 00:23:15.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.513 
"is_configured": false, 00:23:15.513 "data_offset": 2048, 00:23:15.513 "data_size": 63488 00:23:15.513 }, 00:23:15.513 { 00:23:15.513 "name": null, 00:23:15.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.513 "is_configured": false, 00:23:15.513 "data_offset": 2048, 00:23:15.513 "data_size": 63488 00:23:15.513 }, 00:23:15.513 { 00:23:15.513 "name": "BaseBdev3", 00:23:15.513 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:15.513 "is_configured": true, 00:23:15.513 "data_offset": 2048, 00:23:15.513 "data_size": 63488 00:23:15.513 }, 00:23:15.513 { 00:23:15.513 "name": "BaseBdev4", 00:23:15.513 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:15.513 "is_configured": true, 00:23:15.513 "data_offset": 2048, 00:23:15.513 "data_size": 63488 00:23:15.513 } 00:23:15.513 ] 00:23:15.513 }' 00:23:15.513 10:18:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:15.513 10:18:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:15.513 10:18:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:15.513 10:18:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:15.513 10:18:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:15.773 10:18:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:15.773 [2024-06-10 10:18:37.608641] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:15.773 [2024-06-10 10:18:37.608671] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:15.773 [2024-06-10 10:18:37.608682] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25f33c0 00:23:15.773 [2024-06-10 10:18:37.608689] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:15.773 [2024-06-10 10:18:37.608971] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:15.773 [2024-06-10 10:18:37.608982] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:15.773 [2024-06-10 10:18:37.609029] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:15.773 [2024-06-10 10:18:37.609036] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:23:15.773 [2024-06-10 10:18:37.609041] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:15.773 BaseBdev1 00:23:15.773 10:18:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.154 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.154 "name": "raid_bdev1", 00:23:17.154 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:17.154 "strip_size_kb": 0, 00:23:17.154 "state": "online", 00:23:17.154 "raid_level": "raid1", 00:23:17.154 "superblock": true, 00:23:17.154 "num_base_bdevs": 4, 00:23:17.155 "num_base_bdevs_discovered": 2, 00:23:17.155 "num_base_bdevs_operational": 2, 00:23:17.155 "base_bdevs_list": [ 00:23:17.155 { 00:23:17.155 "name": null, 00:23:17.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.155 "is_configured": false, 00:23:17.155 "data_offset": 2048, 00:23:17.155 "data_size": 63488 00:23:17.155 }, 00:23:17.155 { 00:23:17.155 "name": null, 00:23:17.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.155 "is_configured": false, 00:23:17.155 "data_offset": 2048, 00:23:17.155 "data_size": 63488 00:23:17.155 }, 00:23:17.155 { 00:23:17.155 "name": "BaseBdev3", 00:23:17.155 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:17.155 "is_configured": true, 00:23:17.155 "data_offset": 2048, 00:23:17.155 "data_size": 63488 00:23:17.155 }, 00:23:17.155 { 00:23:17.155 "name": "BaseBdev4", 00:23:17.155 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:17.155 "is_configured": true, 00:23:17.155 "data_offset": 2048, 00:23:17.155 "data_size": 63488 00:23:17.155 } 00:23:17.155 ] 00:23:17.155 }' 00:23:17.155 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.155 10:18:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:17.725 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:17.725 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:17.725 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:17.725 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:17.725 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:17.725 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.725 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.725 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:23:17.725 "name": "raid_bdev1", 00:23:17.725 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:17.725 "strip_size_kb": 0, 00:23:17.725 "state": "online", 00:23:17.725 "raid_level": "raid1", 00:23:17.725 "superblock": true, 00:23:17.725 "num_base_bdevs": 4, 00:23:17.725 "num_base_bdevs_discovered": 2, 00:23:17.725 "num_base_bdevs_operational": 2, 00:23:17.725 "base_bdevs_list": [ 00:23:17.725 { 00:23:17.725 "name": null, 00:23:17.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.725 "is_configured": false, 00:23:17.725 "data_offset": 2048, 00:23:17.725 "data_size": 63488 00:23:17.725 }, 00:23:17.725 { 00:23:17.725 "name": null, 00:23:17.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.725 "is_configured": false, 00:23:17.725 "data_offset": 2048, 00:23:17.725 "data_size": 63488 00:23:17.725 }, 00:23:17.725 { 00:23:17.725 "name": "BaseBdev3", 00:23:17.725 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:17.725 "is_configured": true, 00:23:17.725 "data_offset": 2048, 00:23:17.725 "data_size": 63488 00:23:17.725 }, 00:23:17.725 { 00:23:17.725 "name": "BaseBdev4", 00:23:17.725 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:17.725 "is_configured": true, 00:23:17.725 "data_offset": 2048, 00:23:17.725 "data_size": 63488 00:23:17.725 } 00:23:17.725 ] 00:23:17.725 }' 00:23:17.725 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:17.725 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@649 -- # local es=0 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:17.986 
10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:17.986 [2024-06-10 10:18:39.794418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:17.986 [2024-06-10 10:18:39.794514] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:23:17.986 [2024-06-10 10:18:39.794522] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:17.986 request: 00:23:17.986 { 00:23:17.986 "raid_bdev": "raid_bdev1", 00:23:17.986 "base_bdev": "BaseBdev1", 00:23:17.986 "method": "bdev_raid_add_base_bdev", 00:23:17.986 "req_id": 1 00:23:17.986 } 00:23:17.986 Got JSON-RPC error response 00:23:17.986 response: 00:23:17.986 { 00:23:17.986 "code": -22, 00:23:17.986 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:17.986 } 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # es=1 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:23:17.986 10:18:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.368 10:18:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.368 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:19.368 "name": "raid_bdev1", 00:23:19.368 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:19.368 "strip_size_kb": 0, 00:23:19.368 "state": "online", 00:23:19.368 "raid_level": "raid1", 00:23:19.368 "superblock": true, 00:23:19.368 "num_base_bdevs": 4, 00:23:19.368 "num_base_bdevs_discovered": 2, 00:23:19.368 "num_base_bdevs_operational": 2, 00:23:19.368 "base_bdevs_list": [ 
00:23:19.368 { 00:23:19.368 "name": null, 00:23:19.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.369 "is_configured": false, 00:23:19.369 "data_offset": 2048, 00:23:19.369 "data_size": 63488 00:23:19.369 }, 00:23:19.369 { 00:23:19.369 "name": null, 00:23:19.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.369 "is_configured": false, 00:23:19.369 "data_offset": 2048, 00:23:19.369 "data_size": 63488 00:23:19.369 }, 00:23:19.369 { 00:23:19.369 "name": "BaseBdev3", 00:23:19.369 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:19.369 "is_configured": true, 00:23:19.369 "data_offset": 2048, 00:23:19.369 "data_size": 63488 00:23:19.369 }, 00:23:19.369 { 00:23:19.369 "name": "BaseBdev4", 00:23:19.369 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:19.369 "is_configured": true, 00:23:19.369 "data_offset": 2048, 00:23:19.369 "data_size": 63488 00:23:19.369 } 00:23:19.369 ] 00:23:19.369 }' 00:23:19.369 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:19.369 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:19.938 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:19.938 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:19.938 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:19.938 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:19.938 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:19.938 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.938 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.938 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:19.938 "name": "raid_bdev1", 00:23:19.938 "uuid": "08c59860-8582-4659-9722-2e269db3e407", 00:23:19.938 "strip_size_kb": 0, 00:23:19.938 "state": "online", 00:23:19.938 "raid_level": "raid1", 00:23:19.938 "superblock": true, 00:23:19.938 "num_base_bdevs": 4, 00:23:19.938 "num_base_bdevs_discovered": 2, 00:23:19.938 "num_base_bdevs_operational": 2, 00:23:19.938 "base_bdevs_list": [ 00:23:19.938 { 00:23:19.938 "name": null, 00:23:19.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.938 "is_configured": false, 00:23:19.938 "data_offset": 2048, 00:23:19.938 "data_size": 63488 00:23:19.938 }, 00:23:19.938 { 00:23:19.938 "name": null, 00:23:19.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.938 "is_configured": false, 00:23:19.938 "data_offset": 2048, 00:23:19.938 "data_size": 63488 00:23:19.938 }, 00:23:19.938 { 00:23:19.938 "name": "BaseBdev3", 00:23:19.938 "uuid": "717dc960-cc24-5cda-a286-e8777a36dfb0", 00:23:19.938 "is_configured": true, 00:23:19.938 "data_offset": 2048, 00:23:19.938 "data_size": 63488 00:23:19.938 }, 00:23:19.938 { 00:23:19.938 "name": "BaseBdev4", 00:23:19.938 "uuid": "4a9d7050-c4e7-556d-8c5d-db1326850777", 00:23:19.938 "is_configured": true, 00:23:19.938 "data_offset": 2048, 00:23:19.938 "data_size": 63488 00:23:19.938 } 00:23:19.938 ] 00:23:19.938 }' 00:23:19.938 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:23:19.938 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:19.938 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1100189 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@949 -- # '[' -z 1100189 ']' 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # kill -0 1100189 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # uname 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1100189 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1100189' 00:23:20.228 killing process with pid 1100189 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # kill 1100189 00:23:20.228 Received shutdown signal, test time was about 24.045397 seconds 00:23:20.228 00:23:20.228 Latency(us) 00:23:20.228 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:20.228 =================================================================================================================== 00:23:20.228 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:20.228 [2024-06-10 10:18:41.854168] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@973 -- # wait 1100189 00:23:20.228 [2024-06-10 10:18:41.854244] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:20.228 [2024-06-10 10:18:41.854287] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:20.228 [2024-06-10 10:18:41.854293] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2669340 name raid_bdev1, state offline 00:23:20.228 [2024-06-10 10:18:41.877642] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:20.228 00:23:20.228 real 0m28.587s 00:23:20.228 user 0m45.144s 00:23:20.228 sys 0m3.464s 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:20.228 10:18:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:20.228 ************************************ 00:23:20.228 END TEST raid_rebuild_test_sb_io 00:23:20.228 ************************************ 00:23:20.228 10:18:42 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:23:20.228 10:18:42 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:23:20.228 10:18:42 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:23:20.228 10:18:42 bdev_raid -- 
common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:23:20.228 10:18:42 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:20.228 10:18:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:20.228 ************************************ 00:23:20.228 START TEST raid_state_function_test_sb_4k 00:23:20.228 ************************************ 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=1105267 00:23:20.228 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1105267' 00:23:20.228 Process raid pid: 1105267 00:23:20.229 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:20.229 10:18:42 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 1105267 /var/tmp/spdk-raid.sock 00:23:20.229 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@830 -- # '[' -z 1105267 ']' 00:23:20.229 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:20.229 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:20.229 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:20.229 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:20.229 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:20.229 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:20.530 [2024-06-10 10:18:42.094922] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:23:20.530 [2024-06-10 10:18:42.094955] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:20.530 [2024-06-10 10:18:42.174036] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:20.530 [2024-06-10 10:18:42.236224] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:20.530 [2024-06-10 10:18:42.274647] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:20.530 [2024-06-10 10:18:42.274667] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:20.530 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:20.530 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@863 -- # return 0 00:23:20.530 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:20.789 [2024-06-10 10:18:42.452016] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:20.789 [2024-06-10 10:18:42.452044] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:20.789 [2024-06-10 10:18:42.452049] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:20.789 [2024-06-10 10:18:42.452055] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.789 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:21.048 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:21.048 "name": "Existed_Raid", 00:23:21.048 "uuid": "b711f13c-bf53-42aa-8ccf-4caa759b6804", 00:23:21.048 "strip_size_kb": 0, 00:23:21.048 "state": "configuring", 00:23:21.048 "raid_level": "raid1", 00:23:21.048 "superblock": true, 00:23:21.048 "num_base_bdevs": 2, 00:23:21.048 "num_base_bdevs_discovered": 0, 00:23:21.048 "num_base_bdevs_operational": 2, 00:23:21.048 "base_bdevs_list": [ 00:23:21.048 { 00:23:21.048 "name": "BaseBdev1", 00:23:21.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:21.048 "is_configured": false, 00:23:21.048 "data_offset": 0, 00:23:21.048 "data_size": 0 00:23:21.048 }, 00:23:21.048 { 00:23:21.048 "name": "BaseBdev2", 00:23:21.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:21.048 "is_configured": false, 00:23:21.048 "data_offset": 0, 00:23:21.048 "data_size": 0 00:23:21.048 } 00:23:21.048 ] 00:23:21.048 }' 00:23:21.048 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:21.048 10:18:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:21.617 10:18:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:21.617 [2024-06-10 10:18:43.358208] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:21.617 [2024-06-10 10:18:43.358225] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c67b00 name Existed_Raid, state configuring 00:23:21.617 10:18:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:21.876 [2024-06-10 10:18:43.538672] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:21.876 [2024-06-10 10:18:43.538686] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:21.876 [2024-06-10 10:18:43.538691] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:21.876 [2024-06-10 10:18:43.538696] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:21.876 10:18:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 
00:23:21.876 [2024-06-10 10:18:43.737672] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:21.876 BaseBdev1 00:23:22.136 10:18:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:23:22.136 10:18:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:23:22.136 10:18:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:23:22.136 10:18:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local i 00:23:22.136 10:18:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:23:22.136 10:18:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:23:22.136 10:18:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:22.136 10:18:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:22.396 [ 00:23:22.396 { 00:23:22.396 "name": "BaseBdev1", 00:23:22.396 "aliases": [ 00:23:22.396 "4d1cbd1b-e18a-41d7-b361-71c33b16798d" 00:23:22.396 ], 00:23:22.396 "product_name": "Malloc disk", 00:23:22.396 "block_size": 4096, 00:23:22.396 "num_blocks": 8192, 00:23:22.396 "uuid": "4d1cbd1b-e18a-41d7-b361-71c33b16798d", 00:23:22.396 "assigned_rate_limits": { 00:23:22.396 "rw_ios_per_sec": 0, 00:23:22.396 "rw_mbytes_per_sec": 0, 00:23:22.396 "r_mbytes_per_sec": 0, 00:23:22.396 "w_mbytes_per_sec": 0 00:23:22.396 }, 00:23:22.396 "claimed": true, 00:23:22.396 "claim_type": "exclusive_write", 00:23:22.396 "zoned": false, 00:23:22.396 "supported_io_types": { 00:23:22.396 "read": true, 00:23:22.396 "write": true, 00:23:22.396 "unmap": true, 00:23:22.396 "write_zeroes": true, 00:23:22.396 "flush": true, 00:23:22.396 "reset": true, 00:23:22.396 "compare": false, 00:23:22.396 "compare_and_write": false, 00:23:22.396 "abort": true, 00:23:22.396 "nvme_admin": false, 00:23:22.396 "nvme_io": false 00:23:22.396 }, 00:23:22.396 "memory_domains": [ 00:23:22.396 { 00:23:22.396 "dma_device_id": "system", 00:23:22.396 "dma_device_type": 1 00:23:22.396 }, 00:23:22.396 { 00:23:22.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:22.396 "dma_device_type": 2 00:23:22.396 } 00:23:22.396 ], 00:23:22.396 "driver_specific": {} 00:23:22.396 } 00:23:22.396 ] 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # return 0 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:22.396 10:18:44 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.396 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:22.656 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:22.656 "name": "Existed_Raid", 00:23:22.656 "uuid": "c7aaf3f5-d7dd-4f38-aa50-5668c7c116a0", 00:23:22.656 "strip_size_kb": 0, 00:23:22.656 "state": "configuring", 00:23:22.656 "raid_level": "raid1", 00:23:22.656 "superblock": true, 00:23:22.656 "num_base_bdevs": 2, 00:23:22.656 "num_base_bdevs_discovered": 1, 00:23:22.656 "num_base_bdevs_operational": 2, 00:23:22.656 "base_bdevs_list": [ 00:23:22.656 { 00:23:22.656 "name": "BaseBdev1", 00:23:22.656 "uuid": "4d1cbd1b-e18a-41d7-b361-71c33b16798d", 00:23:22.656 "is_configured": true, 00:23:22.656 "data_offset": 256, 00:23:22.656 "data_size": 7936 00:23:22.656 }, 00:23:22.656 { 00:23:22.656 "name": "BaseBdev2", 00:23:22.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.656 "is_configured": false, 00:23:22.656 "data_offset": 0, 00:23:22.656 "data_size": 0 00:23:22.656 } 00:23:22.656 ] 00:23:22.656 }' 00:23:22.656 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:22.656 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:23.225 10:18:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:23.225 [2024-06-10 10:18:45.040931] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:23.225 [2024-06-10 10:18:45.040954] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c673f0 name Existed_Raid, state configuring 00:23:23.225 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:23.485 [2024-06-10 10:18:45.229438] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:23.485 [2024-06-10 10:18:45.230574] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:23.485 [2024-06-10 10:18:45.230597] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.485 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:23.746 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:23.746 "name": "Existed_Raid", 00:23:23.746 "uuid": "0b4fd08b-e4db-44aa-9a47-3cefd4b0cd21", 00:23:23.746 "strip_size_kb": 0, 00:23:23.746 "state": "configuring", 00:23:23.746 "raid_level": "raid1", 00:23:23.746 "superblock": true, 00:23:23.746 "num_base_bdevs": 2, 00:23:23.746 "num_base_bdevs_discovered": 1, 00:23:23.746 "num_base_bdevs_operational": 2, 00:23:23.746 "base_bdevs_list": [ 00:23:23.746 { 00:23:23.746 "name": "BaseBdev1", 00:23:23.746 "uuid": "4d1cbd1b-e18a-41d7-b361-71c33b16798d", 00:23:23.746 "is_configured": true, 00:23:23.746 "data_offset": 256, 00:23:23.746 "data_size": 7936 00:23:23.746 }, 00:23:23.746 { 00:23:23.746 "name": "BaseBdev2", 00:23:23.746 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:23.746 "is_configured": false, 00:23:23.746 "data_offset": 0, 00:23:23.746 "data_size": 0 00:23:23.746 } 00:23:23.746 ] 00:23:23.746 }' 00:23:23.746 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:23.746 10:18:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:24.316 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:23:24.316 [2024-06-10 10:18:46.180788] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:24.316 [2024-06-10 10:18:46.180902] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c681c0 00:23:24.316 [2024-06-10 10:18:46.180911] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:24.316 [2024-06-10 10:18:46.181051] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e1b220 00:23:24.316 [2024-06-10 10:18:46.181140] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c681c0 00:23:24.316 [2024-06-10 10:18:46.181146] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c681c0 00:23:24.316 
[2024-06-10 10:18:46.181217] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:24.316 BaseBdev2 00:23:24.576 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:23:24.576 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:23:24.576 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:23:24.576 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local i 00:23:24.576 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:23:24.576 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:23:24.576 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:24.576 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:24.837 [ 00:23:24.837 { 00:23:24.837 "name": "BaseBdev2", 00:23:24.837 "aliases": [ 00:23:24.837 "7e08113d-548c-4e43-836d-9707a35db39a" 00:23:24.837 ], 00:23:24.837 "product_name": "Malloc disk", 00:23:24.837 "block_size": 4096, 00:23:24.837 "num_blocks": 8192, 00:23:24.837 "uuid": "7e08113d-548c-4e43-836d-9707a35db39a", 00:23:24.837 "assigned_rate_limits": { 00:23:24.837 "rw_ios_per_sec": 0, 00:23:24.837 "rw_mbytes_per_sec": 0, 00:23:24.837 "r_mbytes_per_sec": 0, 00:23:24.837 "w_mbytes_per_sec": 0 00:23:24.837 }, 00:23:24.837 "claimed": true, 00:23:24.837 "claim_type": "exclusive_write", 00:23:24.837 "zoned": false, 00:23:24.837 "supported_io_types": { 00:23:24.837 "read": true, 00:23:24.837 "write": true, 00:23:24.837 "unmap": true, 00:23:24.837 "write_zeroes": true, 00:23:24.837 "flush": true, 00:23:24.837 "reset": true, 00:23:24.837 "compare": false, 00:23:24.837 "compare_and_write": false, 00:23:24.837 "abort": true, 00:23:24.837 "nvme_admin": false, 00:23:24.837 "nvme_io": false 00:23:24.837 }, 00:23:24.837 "memory_domains": [ 00:23:24.837 { 00:23:24.837 "dma_device_id": "system", 00:23:24.837 "dma_device_type": 1 00:23:24.837 }, 00:23:24.837 { 00:23:24.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.837 "dma_device_type": 2 00:23:24.837 } 00:23:24.837 ], 00:23:24.837 "driver_specific": {} 00:23:24.837 } 00:23:24.837 ] 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # return 0 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 
-- # local strip_size=0 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.837 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:25.098 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:25.098 "name": "Existed_Raid", 00:23:25.098 "uuid": "0b4fd08b-e4db-44aa-9a47-3cefd4b0cd21", 00:23:25.098 "strip_size_kb": 0, 00:23:25.098 "state": "online", 00:23:25.098 "raid_level": "raid1", 00:23:25.098 "superblock": true, 00:23:25.098 "num_base_bdevs": 2, 00:23:25.098 "num_base_bdevs_discovered": 2, 00:23:25.098 "num_base_bdevs_operational": 2, 00:23:25.098 "base_bdevs_list": [ 00:23:25.098 { 00:23:25.098 "name": "BaseBdev1", 00:23:25.098 "uuid": "4d1cbd1b-e18a-41d7-b361-71c33b16798d", 00:23:25.098 "is_configured": true, 00:23:25.098 "data_offset": 256, 00:23:25.098 "data_size": 7936 00:23:25.098 }, 00:23:25.098 { 00:23:25.098 "name": "BaseBdev2", 00:23:25.098 "uuid": "7e08113d-548c-4e43-836d-9707a35db39a", 00:23:25.098 "is_configured": true, 00:23:25.098 "data_offset": 256, 00:23:25.098 "data_size": 7936 00:23:25.098 } 00:23:25.098 ] 00:23:25.098 }' 00:23:25.098 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:25.098 10:18:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:25.668 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:25.668 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:25.668 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:25.668 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:25.668 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:25.668 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:23:25.668 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:25.668 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:25.668 [2024-06-10 10:18:47.504317] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:25.668 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:25.668 "name": "Existed_Raid", 00:23:25.668 "aliases": [ 00:23:25.668 "0b4fd08b-e4db-44aa-9a47-3cefd4b0cd21" 00:23:25.668 ], 00:23:25.668 
"product_name": "Raid Volume", 00:23:25.668 "block_size": 4096, 00:23:25.668 "num_blocks": 7936, 00:23:25.668 "uuid": "0b4fd08b-e4db-44aa-9a47-3cefd4b0cd21", 00:23:25.668 "assigned_rate_limits": { 00:23:25.668 "rw_ios_per_sec": 0, 00:23:25.668 "rw_mbytes_per_sec": 0, 00:23:25.668 "r_mbytes_per_sec": 0, 00:23:25.668 "w_mbytes_per_sec": 0 00:23:25.668 }, 00:23:25.668 "claimed": false, 00:23:25.668 "zoned": false, 00:23:25.668 "supported_io_types": { 00:23:25.668 "read": true, 00:23:25.668 "write": true, 00:23:25.668 "unmap": false, 00:23:25.668 "write_zeroes": true, 00:23:25.668 "flush": false, 00:23:25.668 "reset": true, 00:23:25.668 "compare": false, 00:23:25.668 "compare_and_write": false, 00:23:25.668 "abort": false, 00:23:25.668 "nvme_admin": false, 00:23:25.668 "nvme_io": false 00:23:25.668 }, 00:23:25.668 "memory_domains": [ 00:23:25.668 { 00:23:25.668 "dma_device_id": "system", 00:23:25.668 "dma_device_type": 1 00:23:25.668 }, 00:23:25.668 { 00:23:25.668 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:25.668 "dma_device_type": 2 00:23:25.668 }, 00:23:25.668 { 00:23:25.668 "dma_device_id": "system", 00:23:25.668 "dma_device_type": 1 00:23:25.668 }, 00:23:25.668 { 00:23:25.668 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:25.668 "dma_device_type": 2 00:23:25.668 } 00:23:25.668 ], 00:23:25.668 "driver_specific": { 00:23:25.668 "raid": { 00:23:25.668 "uuid": "0b4fd08b-e4db-44aa-9a47-3cefd4b0cd21", 00:23:25.668 "strip_size_kb": 0, 00:23:25.668 "state": "online", 00:23:25.668 "raid_level": "raid1", 00:23:25.668 "superblock": true, 00:23:25.668 "num_base_bdevs": 2, 00:23:25.668 "num_base_bdevs_discovered": 2, 00:23:25.668 "num_base_bdevs_operational": 2, 00:23:25.668 "base_bdevs_list": [ 00:23:25.668 { 00:23:25.668 "name": "BaseBdev1", 00:23:25.668 "uuid": "4d1cbd1b-e18a-41d7-b361-71c33b16798d", 00:23:25.668 "is_configured": true, 00:23:25.668 "data_offset": 256, 00:23:25.668 "data_size": 7936 00:23:25.668 }, 00:23:25.668 { 00:23:25.668 "name": "BaseBdev2", 00:23:25.668 "uuid": "7e08113d-548c-4e43-836d-9707a35db39a", 00:23:25.668 "is_configured": true, 00:23:25.668 "data_offset": 256, 00:23:25.668 "data_size": 7936 00:23:25.668 } 00:23:25.668 ] 00:23:25.668 } 00:23:25.668 } 00:23:25.668 }' 00:23:25.668 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:25.928 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:23:25.928 BaseBdev2' 00:23:25.928 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:25.928 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:25.928 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:25.928 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:25.928 "name": "BaseBdev1", 00:23:25.928 "aliases": [ 00:23:25.928 "4d1cbd1b-e18a-41d7-b361-71c33b16798d" 00:23:25.928 ], 00:23:25.928 "product_name": "Malloc disk", 00:23:25.928 "block_size": 4096, 00:23:25.928 "num_blocks": 8192, 00:23:25.928 "uuid": "4d1cbd1b-e18a-41d7-b361-71c33b16798d", 00:23:25.928 "assigned_rate_limits": { 00:23:25.928 "rw_ios_per_sec": 0, 00:23:25.928 "rw_mbytes_per_sec": 0, 00:23:25.928 "r_mbytes_per_sec": 0, 
00:23:25.928 "w_mbytes_per_sec": 0 00:23:25.928 }, 00:23:25.928 "claimed": true, 00:23:25.928 "claim_type": "exclusive_write", 00:23:25.928 "zoned": false, 00:23:25.928 "supported_io_types": { 00:23:25.928 "read": true, 00:23:25.928 "write": true, 00:23:25.928 "unmap": true, 00:23:25.928 "write_zeroes": true, 00:23:25.928 "flush": true, 00:23:25.928 "reset": true, 00:23:25.928 "compare": false, 00:23:25.928 "compare_and_write": false, 00:23:25.928 "abort": true, 00:23:25.928 "nvme_admin": false, 00:23:25.928 "nvme_io": false 00:23:25.928 }, 00:23:25.928 "memory_domains": [ 00:23:25.928 { 00:23:25.928 "dma_device_id": "system", 00:23:25.928 "dma_device_type": 1 00:23:25.928 }, 00:23:25.928 { 00:23:25.928 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:25.928 "dma_device_type": 2 00:23:25.928 } 00:23:25.928 ], 00:23:25.928 "driver_specific": {} 00:23:25.928 }' 00:23:25.928 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:26.188 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:26.189 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:26.189 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:26.189 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:26.189 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:26.189 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:26.189 10:18:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:26.189 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:26.189 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.448 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.448 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:26.448 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:26.448 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:26.448 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:26.448 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:26.448 "name": "BaseBdev2", 00:23:26.448 "aliases": [ 00:23:26.448 "7e08113d-548c-4e43-836d-9707a35db39a" 00:23:26.448 ], 00:23:26.448 "product_name": "Malloc disk", 00:23:26.448 "block_size": 4096, 00:23:26.448 "num_blocks": 8192, 00:23:26.448 "uuid": "7e08113d-548c-4e43-836d-9707a35db39a", 00:23:26.449 "assigned_rate_limits": { 00:23:26.449 "rw_ios_per_sec": 0, 00:23:26.449 "rw_mbytes_per_sec": 0, 00:23:26.449 "r_mbytes_per_sec": 0, 00:23:26.449 "w_mbytes_per_sec": 0 00:23:26.449 }, 00:23:26.449 "claimed": true, 00:23:26.449 "claim_type": "exclusive_write", 00:23:26.449 "zoned": false, 00:23:26.449 "supported_io_types": { 00:23:26.449 "read": true, 00:23:26.449 "write": true, 00:23:26.449 "unmap": true, 00:23:26.449 "write_zeroes": true, 00:23:26.449 "flush": true, 00:23:26.449 "reset": true, 00:23:26.449 
"compare": false, 00:23:26.449 "compare_and_write": false, 00:23:26.449 "abort": true, 00:23:26.449 "nvme_admin": false, 00:23:26.449 "nvme_io": false 00:23:26.449 }, 00:23:26.449 "memory_domains": [ 00:23:26.449 { 00:23:26.449 "dma_device_id": "system", 00:23:26.449 "dma_device_type": 1 00:23:26.449 }, 00:23:26.449 { 00:23:26.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:26.449 "dma_device_type": 2 00:23:26.449 } 00:23:26.449 ], 00:23:26.449 "driver_specific": {} 00:23:26.449 }' 00:23:26.449 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:26.709 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:26.709 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:26.709 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:26.709 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:26.709 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:26.709 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:26.709 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:26.969 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:26.969 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.969 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.969 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:26.969 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:27.229 [2024-06-10 10:18:48.843557] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:27.229 10:18:48 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.229 10:18:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:27.229 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:27.229 "name": "Existed_Raid", 00:23:27.229 "uuid": "0b4fd08b-e4db-44aa-9a47-3cefd4b0cd21", 00:23:27.229 "strip_size_kb": 0, 00:23:27.229 "state": "online", 00:23:27.229 "raid_level": "raid1", 00:23:27.229 "superblock": true, 00:23:27.229 "num_base_bdevs": 2, 00:23:27.229 "num_base_bdevs_discovered": 1, 00:23:27.229 "num_base_bdevs_operational": 1, 00:23:27.229 "base_bdevs_list": [ 00:23:27.229 { 00:23:27.229 "name": null, 00:23:27.229 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.229 "is_configured": false, 00:23:27.229 "data_offset": 256, 00:23:27.229 "data_size": 7936 00:23:27.229 }, 00:23:27.229 { 00:23:27.229 "name": "BaseBdev2", 00:23:27.229 "uuid": "7e08113d-548c-4e43-836d-9707a35db39a", 00:23:27.229 "is_configured": true, 00:23:27.229 "data_offset": 256, 00:23:27.229 "data_size": 7936 00:23:27.229 } 00:23:27.229 ] 00:23:27.229 }' 00:23:27.229 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:27.229 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:27.798 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:27.798 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:27.798 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.798 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:28.059 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:28.059 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:28.059 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:28.319 [2024-06-10 10:18:49.974433] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:28.319 [2024-06-10 10:18:49.974494] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:28.319 [2024-06-10 10:18:49.980506] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:28.319 [2024-06-10 10:18:49.980529] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:28.319 [2024-06-10 10:18:49.980538] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c681c0 name Existed_Raid, state offline 00:23:28.319 10:18:49 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:28.319 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:28.319 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.319 10:18:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 1105267 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@949 -- # '[' -z 1105267 ']' 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # kill -0 1105267 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # uname 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1105267 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1105267' 00:23:28.580 killing process with pid 1105267 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@968 -- # kill 1105267 00:23:28.580 [2024-06-10 10:18:50.237059] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@973 -- # wait 1105267 00:23:28.580 [2024-06-10 10:18:50.237653] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:23:28.580 00:23:28.580 real 0m8.291s 00:23:28.580 user 0m15.475s 00:23:28.580 sys 0m1.303s 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:28.580 10:18:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:28.580 ************************************ 00:23:28.580 END TEST raid_state_function_test_sb_4k 00:23:28.580 ************************************ 00:23:28.580 10:18:50 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:23:28.580 10:18:50 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:23:28.580 10:18:50 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:28.580 10:18:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:28.580 ************************************ 00:23:28.580 START TEST raid_superblock_test_4k 00:23:28.580 ************************************ 00:23:28.580 
10:18:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=1106966 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 1106966 /var/tmp/spdk-raid.sock 00:23:28.580 10:18:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@830 -- # '[' -z 1106966 ']' 00:23:28.581 10:18:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:28.581 10:18:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:28.581 10:18:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:28.581 10:18:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:28.581 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:28.581 10:18:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:28.581 10:18:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:28.842 [2024-06-10 10:18:50.499568] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
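The raid_superblock_test_4k run that begins here drives a standalone bdev_svc app entirely over the /var/tmp/spdk-raid.sock RPC socket. A minimal sketch of the same setup sequence, assuming bdev_svc is already listening on that socket and reusing only the malloc, passthru, and raid parameters that appear later in this trace (the rpc.py path and socket match this workspace; nothing else is assumed):

  # Sketch of the RPC sequence this test emits; run against an existing bdev_svc.
  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Two malloc bdevs (size 32, block size 4096), each wrapped in a passthru
  # bdev with a fixed UUID, as in the trace below.
  $RPC bdev_malloc_create 32 4096 -b malloc1
  $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $RPC bdev_malloc_create 32 4096 -b malloc2
  $RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

  # Assemble a raid1 volume with an on-disk superblock (-s), then read back its state.
  $RPC bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'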
00:23:28.842 [2024-06-10 10:18:50.499630] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1106966 ] 00:23:28.842 [2024-06-10 10:18:50.587325] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:28.842 [2024-06-10 10:18:50.649568] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:28.842 [2024-06-10 10:18:50.692793] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:28.842 [2024-06-10 10:18:50.692818] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@863 -- # return 0 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:23:29.784 malloc1 00:23:29.784 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:30.044 [2024-06-10 10:18:51.691476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:30.044 [2024-06-10 10:18:51.691508] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.044 [2024-06-10 10:18:51.691520] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d79990 00:23:30.044 [2024-06-10 10:18:51.691527] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.044 [2024-06-10 10:18:51.692885] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.044 [2024-06-10 10:18:51.692904] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:30.044 pt1 00:23:30.044 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:30.044 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:30.044 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:23:30.044 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:23:30.044 10:18:51 bdev_raid.raid_superblock_test_4k 
-- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:30.044 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:30.044 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:30.044 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:30.044 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:23:30.044 malloc2 00:23:30.044 10:18:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:30.305 [2024-06-10 10:18:52.058427] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:30.305 [2024-06-10 10:18:52.058454] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.305 [2024-06-10 10:18:52.058465] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d7a4e0 00:23:30.305 [2024-06-10 10:18:52.058471] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.305 [2024-06-10 10:18:52.059707] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.305 [2024-06-10 10:18:52.059727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:30.305 pt2 00:23:30.305 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:30.305 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:30.305 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:23:30.565 [2024-06-10 10:18:52.246909] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:30.565 [2024-06-10 10:18:52.247912] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:30.565 [2024-06-10 10:18:52.248024] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f22bc0 00:23:30.565 [2024-06-10 10:18:52.248032] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:30.565 [2024-06-10 10:18:52.248176] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d78850 00:23:30.565 [2024-06-10 10:18:52.248282] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f22bc0 00:23:30.565 [2024-06-10 10:18:52.248288] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f22bc0 00:23:30.565 [2024-06-10 10:18:52.248355] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.565 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.826 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:30.826 "name": "raid_bdev1", 00:23:30.826 "uuid": "572b2e92-7781-43cf-8911-db358fd870bc", 00:23:30.826 "strip_size_kb": 0, 00:23:30.826 "state": "online", 00:23:30.826 "raid_level": "raid1", 00:23:30.826 "superblock": true, 00:23:30.826 "num_base_bdevs": 2, 00:23:30.826 "num_base_bdevs_discovered": 2, 00:23:30.826 "num_base_bdevs_operational": 2, 00:23:30.826 "base_bdevs_list": [ 00:23:30.826 { 00:23:30.826 "name": "pt1", 00:23:30.826 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:30.826 "is_configured": true, 00:23:30.826 "data_offset": 256, 00:23:30.826 "data_size": 7936 00:23:30.826 }, 00:23:30.826 { 00:23:30.826 "name": "pt2", 00:23:30.826 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:30.826 "is_configured": true, 00:23:30.826 "data_offset": 256, 00:23:30.826 "data_size": 7936 00:23:30.826 } 00:23:30.826 ] 00:23:30.826 }' 00:23:30.826 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:30.826 10:18:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:31.395 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:23:31.395 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:31.395 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:31.395 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:31.395 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:31.395 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:23:31.395 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:31.395 10:18:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:31.395 [2024-06-10 10:18:53.169398] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:31.395 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:31.395 "name": "raid_bdev1", 00:23:31.395 "aliases": [ 00:23:31.395 "572b2e92-7781-43cf-8911-db358fd870bc" 00:23:31.395 ], 00:23:31.395 "product_name": "Raid Volume", 00:23:31.395 
"block_size": 4096, 00:23:31.395 "num_blocks": 7936, 00:23:31.395 "uuid": "572b2e92-7781-43cf-8911-db358fd870bc", 00:23:31.395 "assigned_rate_limits": { 00:23:31.395 "rw_ios_per_sec": 0, 00:23:31.395 "rw_mbytes_per_sec": 0, 00:23:31.395 "r_mbytes_per_sec": 0, 00:23:31.395 "w_mbytes_per_sec": 0 00:23:31.395 }, 00:23:31.395 "claimed": false, 00:23:31.395 "zoned": false, 00:23:31.395 "supported_io_types": { 00:23:31.395 "read": true, 00:23:31.395 "write": true, 00:23:31.395 "unmap": false, 00:23:31.395 "write_zeroes": true, 00:23:31.395 "flush": false, 00:23:31.395 "reset": true, 00:23:31.395 "compare": false, 00:23:31.396 "compare_and_write": false, 00:23:31.396 "abort": false, 00:23:31.396 "nvme_admin": false, 00:23:31.396 "nvme_io": false 00:23:31.396 }, 00:23:31.396 "memory_domains": [ 00:23:31.396 { 00:23:31.396 "dma_device_id": "system", 00:23:31.396 "dma_device_type": 1 00:23:31.396 }, 00:23:31.396 { 00:23:31.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.396 "dma_device_type": 2 00:23:31.396 }, 00:23:31.396 { 00:23:31.396 "dma_device_id": "system", 00:23:31.396 "dma_device_type": 1 00:23:31.396 }, 00:23:31.396 { 00:23:31.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.396 "dma_device_type": 2 00:23:31.396 } 00:23:31.396 ], 00:23:31.396 "driver_specific": { 00:23:31.396 "raid": { 00:23:31.396 "uuid": "572b2e92-7781-43cf-8911-db358fd870bc", 00:23:31.396 "strip_size_kb": 0, 00:23:31.396 "state": "online", 00:23:31.396 "raid_level": "raid1", 00:23:31.396 "superblock": true, 00:23:31.396 "num_base_bdevs": 2, 00:23:31.396 "num_base_bdevs_discovered": 2, 00:23:31.396 "num_base_bdevs_operational": 2, 00:23:31.396 "base_bdevs_list": [ 00:23:31.396 { 00:23:31.396 "name": "pt1", 00:23:31.396 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:31.396 "is_configured": true, 00:23:31.396 "data_offset": 256, 00:23:31.396 "data_size": 7936 00:23:31.396 }, 00:23:31.396 { 00:23:31.396 "name": "pt2", 00:23:31.396 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:31.396 "is_configured": true, 00:23:31.396 "data_offset": 256, 00:23:31.396 "data_size": 7936 00:23:31.396 } 00:23:31.396 ] 00:23:31.396 } 00:23:31.396 } 00:23:31.396 }' 00:23:31.396 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:31.396 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:31.396 pt2' 00:23:31.396 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:31.396 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:31.396 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:31.656 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:31.656 "name": "pt1", 00:23:31.656 "aliases": [ 00:23:31.656 "00000000-0000-0000-0000-000000000001" 00:23:31.656 ], 00:23:31.656 "product_name": "passthru", 00:23:31.656 "block_size": 4096, 00:23:31.656 "num_blocks": 8192, 00:23:31.656 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:31.656 "assigned_rate_limits": { 00:23:31.656 "rw_ios_per_sec": 0, 00:23:31.656 "rw_mbytes_per_sec": 0, 00:23:31.656 "r_mbytes_per_sec": 0, 00:23:31.656 "w_mbytes_per_sec": 0 00:23:31.656 }, 00:23:31.656 "claimed": true, 00:23:31.656 "claim_type": "exclusive_write", 
00:23:31.656 "zoned": false, 00:23:31.656 "supported_io_types": { 00:23:31.656 "read": true, 00:23:31.656 "write": true, 00:23:31.656 "unmap": true, 00:23:31.656 "write_zeroes": true, 00:23:31.656 "flush": true, 00:23:31.656 "reset": true, 00:23:31.656 "compare": false, 00:23:31.656 "compare_and_write": false, 00:23:31.656 "abort": true, 00:23:31.656 "nvme_admin": false, 00:23:31.656 "nvme_io": false 00:23:31.656 }, 00:23:31.656 "memory_domains": [ 00:23:31.656 { 00:23:31.656 "dma_device_id": "system", 00:23:31.656 "dma_device_type": 1 00:23:31.656 }, 00:23:31.656 { 00:23:31.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.656 "dma_device_type": 2 00:23:31.656 } 00:23:31.656 ], 00:23:31.656 "driver_specific": { 00:23:31.656 "passthru": { 00:23:31.656 "name": "pt1", 00:23:31.656 "base_bdev_name": "malloc1" 00:23:31.656 } 00:23:31.656 } 00:23:31.656 }' 00:23:31.656 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:31.656 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:31.656 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:31.656 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:31.917 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:31.917 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:31.917 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:31.917 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:31.917 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:31.917 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:31.917 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:31.917 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:31.917 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:32.178 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:32.178 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:32.178 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:32.178 "name": "pt2", 00:23:32.178 "aliases": [ 00:23:32.178 "00000000-0000-0000-0000-000000000002" 00:23:32.178 ], 00:23:32.178 "product_name": "passthru", 00:23:32.178 "block_size": 4096, 00:23:32.178 "num_blocks": 8192, 00:23:32.178 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:32.178 "assigned_rate_limits": { 00:23:32.178 "rw_ios_per_sec": 0, 00:23:32.178 "rw_mbytes_per_sec": 0, 00:23:32.178 "r_mbytes_per_sec": 0, 00:23:32.178 "w_mbytes_per_sec": 0 00:23:32.178 }, 00:23:32.178 "claimed": true, 00:23:32.178 "claim_type": "exclusive_write", 00:23:32.178 "zoned": false, 00:23:32.178 "supported_io_types": { 00:23:32.178 "read": true, 00:23:32.178 "write": true, 00:23:32.178 "unmap": true, 00:23:32.178 "write_zeroes": true, 00:23:32.178 "flush": true, 00:23:32.178 "reset": true, 00:23:32.178 "compare": false, 00:23:32.178 "compare_and_write": false, 00:23:32.178 "abort": true, 00:23:32.178 "nvme_admin": false, 00:23:32.178 
"nvme_io": false 00:23:32.178 }, 00:23:32.178 "memory_domains": [ 00:23:32.178 { 00:23:32.178 "dma_device_id": "system", 00:23:32.178 "dma_device_type": 1 00:23:32.178 }, 00:23:32.178 { 00:23:32.178 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:32.178 "dma_device_type": 2 00:23:32.178 } 00:23:32.178 ], 00:23:32.178 "driver_specific": { 00:23:32.178 "passthru": { 00:23:32.178 "name": "pt2", 00:23:32.178 "base_bdev_name": "malloc2" 00:23:32.178 } 00:23:32.178 } 00:23:32.178 }' 00:23:32.178 10:18:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:32.178 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:32.438 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:32.438 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:32.438 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:32.438 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:32.438 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.438 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.438 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:32.438 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:32.438 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:32.698 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:32.698 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:23:32.698 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:32.698 [2024-06-10 10:18:54.500741] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:32.698 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=572b2e92-7781-43cf-8911-db358fd870bc 00:23:32.698 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 572b2e92-7781-43cf-8911-db358fd870bc ']' 00:23:32.698 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:32.959 [2024-06-10 10:18:54.689055] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:32.959 [2024-06-10 10:18:54.689064] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:32.959 [2024-06-10 10:18:54.689100] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:32.959 [2024-06-10 10:18:54.689136] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:32.959 [2024-06-10 10:18:54.689142] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f22bc0 name raid_bdev1, state offline 00:23:32.959 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.959 10:18:54 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:23:33.219 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:23:33.219 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:23:33.219 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:33.219 10:18:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:33.479 10:18:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:33.479 10:18:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:33.479 10:18:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:33.479 10:18:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@649 -- # local es=0 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:33.739 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:33.999 [2024-06-10 10:18:55.643434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:33.999 [2024-06-10 10:18:55.644495] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:33.999 [2024-06-10 10:18:55.644537] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:33.999 [2024-06-10 10:18:55.644564] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:33.999 [2024-06-10 10:18:55.644574] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:33.999 [2024-06-10 10:18:55.644580] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f23fa0 name raid_bdev1, state configuring 00:23:33.999 request: 00:23:33.999 { 00:23:33.999 "name": "raid_bdev1", 00:23:33.999 "raid_level": "raid1", 00:23:33.999 "base_bdevs": [ 00:23:33.999 "malloc1", 00:23:33.999 "malloc2" 00:23:33.999 ], 00:23:33.999 "superblock": false, 00:23:33.999 "method": "bdev_raid_create", 00:23:33.999 "req_id": 1 00:23:33.999 } 00:23:33.999 Got JSON-RPC error response 00:23:33.999 response: 00:23:33.999 { 00:23:33.999 "code": -17, 00:23:33.999 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:33.999 } 00:23:33.999 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # es=1 00:23:33.999 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:23:33.999 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:23:33.999 10:18:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:23:34.000 10:18:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.000 10:18:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:23:34.000 10:18:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:23:34.000 10:18:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:23:34.000 10:18:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:34.260 [2024-06-10 10:18:56.012328] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:34.260 [2024-06-10 10:18:56.012350] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:34.260 [2024-06-10 10:18:56.012359] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f22900 00:23:34.260 [2024-06-10 10:18:56.012366] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:34.260 [2024-06-10 10:18:56.013653] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:34.260 [2024-06-10 10:18:56.013672] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:34.260 [2024-06-10 10:18:56.013714] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:34.260 [2024-06-10 10:18:56.013730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:34.260 pt1 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.260 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.520 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:34.520 "name": "raid_bdev1", 00:23:34.520 "uuid": "572b2e92-7781-43cf-8911-db358fd870bc", 00:23:34.520 "strip_size_kb": 0, 00:23:34.520 "state": "configuring", 00:23:34.520 "raid_level": "raid1", 00:23:34.520 "superblock": true, 00:23:34.520 "num_base_bdevs": 2, 00:23:34.520 "num_base_bdevs_discovered": 1, 00:23:34.520 "num_base_bdevs_operational": 2, 00:23:34.520 "base_bdevs_list": [ 00:23:34.520 { 00:23:34.520 "name": "pt1", 00:23:34.520 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:34.520 "is_configured": true, 00:23:34.520 "data_offset": 256, 00:23:34.520 "data_size": 7936 00:23:34.520 }, 00:23:34.520 { 00:23:34.520 "name": null, 00:23:34.520 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:34.520 "is_configured": false, 00:23:34.520 "data_offset": 256, 00:23:34.520 "data_size": 7936 00:23:34.520 } 00:23:34.520 ] 00:23:34.520 }' 00:23:34.520 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:34.520 10:18:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:35.089 [2024-06-10 10:18:56.906591] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:35.089 [2024-06-10 10:18:56.906617] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:35.089 [2024-06-10 10:18:56.906626] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2b190 00:23:35.089 [2024-06-10 10:18:56.906632] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:35.089 [2024-06-10 10:18:56.906885] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:35.089 [2024-06-10 10:18:56.906896] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:35.089 [2024-06-10 10:18:56.906933] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:35.089 [2024-06-10 10:18:56.906944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:35.089 [2024-06-10 10:18:56.907013] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f2aba0 00:23:35.089 [2024-06-10 10:18:56.907019] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:35.089 [2024-06-10 10:18:56.907150] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f22b90 00:23:35.089 [2024-06-10 10:18:56.907248] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f2aba0 00:23:35.089 [2024-06-10 10:18:56.907253] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f2aba0 00:23:35.089 [2024-06-10 10:18:56.907323] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:35.089 pt2 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.089 10:18:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.348 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:35.348 "name": "raid_bdev1", 00:23:35.348 "uuid": "572b2e92-7781-43cf-8911-db358fd870bc", 00:23:35.348 "strip_size_kb": 0, 00:23:35.348 "state": "online", 00:23:35.348 "raid_level": "raid1", 00:23:35.348 "superblock": true, 00:23:35.348 "num_base_bdevs": 2, 00:23:35.348 "num_base_bdevs_discovered": 2, 00:23:35.348 "num_base_bdevs_operational": 2, 00:23:35.348 "base_bdevs_list": [ 00:23:35.348 { 00:23:35.348 "name": "pt1", 00:23:35.348 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:35.348 "is_configured": true, 00:23:35.348 "data_offset": 256, 00:23:35.348 
"data_size": 7936 00:23:35.348 }, 00:23:35.348 { 00:23:35.348 "name": "pt2", 00:23:35.348 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:35.348 "is_configured": true, 00:23:35.348 "data_offset": 256, 00:23:35.348 "data_size": 7936 00:23:35.348 } 00:23:35.348 ] 00:23:35.348 }' 00:23:35.348 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:35.348 10:18:57 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:35.916 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:23:35.916 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:35.916 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:35.916 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:35.916 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:35.916 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:23:35.916 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:35.916 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:36.176 [2024-06-10 10:18:57.817062] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:36.176 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:36.176 "name": "raid_bdev1", 00:23:36.176 "aliases": [ 00:23:36.176 "572b2e92-7781-43cf-8911-db358fd870bc" 00:23:36.176 ], 00:23:36.176 "product_name": "Raid Volume", 00:23:36.176 "block_size": 4096, 00:23:36.177 "num_blocks": 7936, 00:23:36.177 "uuid": "572b2e92-7781-43cf-8911-db358fd870bc", 00:23:36.177 "assigned_rate_limits": { 00:23:36.177 "rw_ios_per_sec": 0, 00:23:36.177 "rw_mbytes_per_sec": 0, 00:23:36.177 "r_mbytes_per_sec": 0, 00:23:36.177 "w_mbytes_per_sec": 0 00:23:36.177 }, 00:23:36.177 "claimed": false, 00:23:36.177 "zoned": false, 00:23:36.177 "supported_io_types": { 00:23:36.177 "read": true, 00:23:36.177 "write": true, 00:23:36.177 "unmap": false, 00:23:36.177 "write_zeroes": true, 00:23:36.177 "flush": false, 00:23:36.177 "reset": true, 00:23:36.177 "compare": false, 00:23:36.177 "compare_and_write": false, 00:23:36.177 "abort": false, 00:23:36.177 "nvme_admin": false, 00:23:36.177 "nvme_io": false 00:23:36.177 }, 00:23:36.177 "memory_domains": [ 00:23:36.177 { 00:23:36.177 "dma_device_id": "system", 00:23:36.177 "dma_device_type": 1 00:23:36.177 }, 00:23:36.177 { 00:23:36.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:36.177 "dma_device_type": 2 00:23:36.177 }, 00:23:36.177 { 00:23:36.177 "dma_device_id": "system", 00:23:36.177 "dma_device_type": 1 00:23:36.177 }, 00:23:36.177 { 00:23:36.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:36.177 "dma_device_type": 2 00:23:36.177 } 00:23:36.177 ], 00:23:36.177 "driver_specific": { 00:23:36.177 "raid": { 00:23:36.177 "uuid": "572b2e92-7781-43cf-8911-db358fd870bc", 00:23:36.177 "strip_size_kb": 0, 00:23:36.177 "state": "online", 00:23:36.177 "raid_level": "raid1", 00:23:36.177 "superblock": true, 00:23:36.177 "num_base_bdevs": 2, 00:23:36.177 "num_base_bdevs_discovered": 2, 00:23:36.177 "num_base_bdevs_operational": 2, 00:23:36.177 "base_bdevs_list": [ 00:23:36.177 { 
00:23:36.177 "name": "pt1", 00:23:36.177 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:36.177 "is_configured": true, 00:23:36.177 "data_offset": 256, 00:23:36.177 "data_size": 7936 00:23:36.177 }, 00:23:36.177 { 00:23:36.177 "name": "pt2", 00:23:36.177 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:36.177 "is_configured": true, 00:23:36.177 "data_offset": 256, 00:23:36.177 "data_size": 7936 00:23:36.177 } 00:23:36.177 ] 00:23:36.177 } 00:23:36.177 } 00:23:36.177 }' 00:23:36.177 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:36.177 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:36.177 pt2' 00:23:36.177 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:36.177 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:36.177 10:18:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:36.437 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:36.437 "name": "pt1", 00:23:36.437 "aliases": [ 00:23:36.437 "00000000-0000-0000-0000-000000000001" 00:23:36.437 ], 00:23:36.437 "product_name": "passthru", 00:23:36.437 "block_size": 4096, 00:23:36.437 "num_blocks": 8192, 00:23:36.437 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:36.437 "assigned_rate_limits": { 00:23:36.437 "rw_ios_per_sec": 0, 00:23:36.437 "rw_mbytes_per_sec": 0, 00:23:36.437 "r_mbytes_per_sec": 0, 00:23:36.437 "w_mbytes_per_sec": 0 00:23:36.437 }, 00:23:36.437 "claimed": true, 00:23:36.437 "claim_type": "exclusive_write", 00:23:36.437 "zoned": false, 00:23:36.437 "supported_io_types": { 00:23:36.437 "read": true, 00:23:36.437 "write": true, 00:23:36.437 "unmap": true, 00:23:36.437 "write_zeroes": true, 00:23:36.437 "flush": true, 00:23:36.437 "reset": true, 00:23:36.437 "compare": false, 00:23:36.437 "compare_and_write": false, 00:23:36.437 "abort": true, 00:23:36.437 "nvme_admin": false, 00:23:36.437 "nvme_io": false 00:23:36.437 }, 00:23:36.437 "memory_domains": [ 00:23:36.437 { 00:23:36.437 "dma_device_id": "system", 00:23:36.437 "dma_device_type": 1 00:23:36.437 }, 00:23:36.437 { 00:23:36.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:36.437 "dma_device_type": 2 00:23:36.437 } 00:23:36.437 ], 00:23:36.437 "driver_specific": { 00:23:36.437 "passthru": { 00:23:36.437 "name": "pt1", 00:23:36.437 "base_bdev_name": "malloc1" 00:23:36.437 } 00:23:36.437 } 00:23:36.437 }' 00:23:36.437 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:36.437 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:36.438 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:36.438 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:36.438 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:36.438 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:36.438 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:36.438 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:36.698 
10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:36.698 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:36.698 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:36.698 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:36.698 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:36.698 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:36.698 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:36.958 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:36.958 "name": "pt2", 00:23:36.958 "aliases": [ 00:23:36.958 "00000000-0000-0000-0000-000000000002" 00:23:36.958 ], 00:23:36.958 "product_name": "passthru", 00:23:36.958 "block_size": 4096, 00:23:36.958 "num_blocks": 8192, 00:23:36.958 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:36.958 "assigned_rate_limits": { 00:23:36.958 "rw_ios_per_sec": 0, 00:23:36.958 "rw_mbytes_per_sec": 0, 00:23:36.958 "r_mbytes_per_sec": 0, 00:23:36.958 "w_mbytes_per_sec": 0 00:23:36.958 }, 00:23:36.958 "claimed": true, 00:23:36.958 "claim_type": "exclusive_write", 00:23:36.958 "zoned": false, 00:23:36.958 "supported_io_types": { 00:23:36.958 "read": true, 00:23:36.958 "write": true, 00:23:36.958 "unmap": true, 00:23:36.958 "write_zeroes": true, 00:23:36.958 "flush": true, 00:23:36.958 "reset": true, 00:23:36.958 "compare": false, 00:23:36.958 "compare_and_write": false, 00:23:36.958 "abort": true, 00:23:36.958 "nvme_admin": false, 00:23:36.958 "nvme_io": false 00:23:36.958 }, 00:23:36.958 "memory_domains": [ 00:23:36.958 { 00:23:36.958 "dma_device_id": "system", 00:23:36.958 "dma_device_type": 1 00:23:36.958 }, 00:23:36.958 { 00:23:36.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:36.958 "dma_device_type": 2 00:23:36.958 } 00:23:36.958 ], 00:23:36.958 "driver_specific": { 00:23:36.958 "passthru": { 00:23:36.958 "name": "pt2", 00:23:36.958 "base_bdev_name": "malloc2" 00:23:36.958 } 00:23:36.958 } 00:23:36.958 }' 00:23:36.958 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:36.958 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:36.958 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:36.958 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:36.958 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:36.958 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:36.958 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:36.958 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:37.218 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:37.219 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:37.219 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:37.219 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == 
null ]] 00:23:37.219 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:37.219 10:18:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:23:37.479 [2024-06-10 10:18:59.096301] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 572b2e92-7781-43cf-8911-db358fd870bc '!=' 572b2e92-7781-43cf-8911-db358fd870bc ']' 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:37.479 [2024-06-10 10:18:59.288637] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.479 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.740 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:37.740 "name": "raid_bdev1", 00:23:37.740 "uuid": "572b2e92-7781-43cf-8911-db358fd870bc", 00:23:37.740 "strip_size_kb": 0, 00:23:37.740 "state": "online", 00:23:37.740 "raid_level": "raid1", 00:23:37.740 "superblock": true, 00:23:37.740 "num_base_bdevs": 2, 00:23:37.740 "num_base_bdevs_discovered": 1, 00:23:37.740 "num_base_bdevs_operational": 1, 00:23:37.740 "base_bdevs_list": [ 00:23:37.740 { 00:23:37.740 "name": null, 00:23:37.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.740 "is_configured": false, 00:23:37.740 "data_offset": 256, 00:23:37.740 "data_size": 7936 00:23:37.740 }, 00:23:37.740 { 00:23:37.740 "name": "pt2", 00:23:37.740 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:37.740 "is_configured": true, 00:23:37.740 
"data_offset": 256, 00:23:37.740 "data_size": 7936 00:23:37.740 } 00:23:37.740 ] 00:23:37.740 }' 00:23:37.740 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:37.740 10:18:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:38.309 10:18:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:38.309 [2024-06-10 10:19:00.150797] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:38.309 [2024-06-10 10:19:00.150819] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:38.309 [2024-06-10 10:19:00.150858] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:38.309 [2024-06-10 10:19:00.150887] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:38.309 [2024-06-10 10:19:00.150893] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f2aba0 name raid_bdev1, state offline 00:23:38.309 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.309 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:23:38.569 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:23:38.569 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:23:38.569 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:23:38.569 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:38.569 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:38.829 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:38.829 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:38.829 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:23:38.829 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:38.829 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:23:38.829 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:39.089 [2024-06-10 10:19:00.732240] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:39.089 [2024-06-10 10:19:00.732267] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:39.089 [2024-06-10 10:19:00.732276] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f27660 00:23:39.090 [2024-06-10 10:19:00.732283] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:39.090 [2024-06-10 10:19:00.733588] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:39.090 [2024-06-10 10:19:00.733606] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev 
for: pt2 00:23:39.090 [2024-06-10 10:19:00.733650] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:39.090 [2024-06-10 10:19:00.733667] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:39.090 [2024-06-10 10:19:00.733727] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f23c10 00:23:39.090 [2024-06-10 10:19:00.733733] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:39.090 [2024-06-10 10:19:00.733875] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f28080 00:23:39.090 [2024-06-10 10:19:00.733970] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f23c10 00:23:39.090 [2024-06-10 10:19:00.733975] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f23c10 00:23:39.090 [2024-06-10 10:19:00.734045] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:39.090 pt2 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:39.090 "name": "raid_bdev1", 00:23:39.090 "uuid": "572b2e92-7781-43cf-8911-db358fd870bc", 00:23:39.090 "strip_size_kb": 0, 00:23:39.090 "state": "online", 00:23:39.090 "raid_level": "raid1", 00:23:39.090 "superblock": true, 00:23:39.090 "num_base_bdevs": 2, 00:23:39.090 "num_base_bdevs_discovered": 1, 00:23:39.090 "num_base_bdevs_operational": 1, 00:23:39.090 "base_bdevs_list": [ 00:23:39.090 { 00:23:39.090 "name": null, 00:23:39.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:39.090 "is_configured": false, 00:23:39.090 "data_offset": 256, 00:23:39.090 "data_size": 7936 00:23:39.090 }, 00:23:39.090 { 00:23:39.090 "name": "pt2", 00:23:39.090 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:39.090 "is_configured": true, 00:23:39.090 "data_offset": 256, 00:23:39.090 "data_size": 7936 00:23:39.090 } 00:23:39.090 ] 00:23:39.090 }' 00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:23:39.090 10:19:00 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:39.662 10:19:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:39.922 [2024-06-10 10:19:01.638509] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:39.922 [2024-06-10 10:19:01.638525] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:39.922 [2024-06-10 10:19:01.638558] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:39.922 [2024-06-10 10:19:01.638586] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:39.922 [2024-06-10 10:19:01.638591] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f23c10 name raid_bdev1, state offline 00:23:39.922 10:19:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.922 10:19:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:23:40.183 10:19:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:23:40.183 10:19:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:23:40.183 10:19:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:23:40.183 10:19:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:40.183 [2024-06-10 10:19:02.015451] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:40.183 [2024-06-10 10:19:02.015474] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:40.184 [2024-06-10 10:19:02.015483] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2ae20 00:23:40.184 [2024-06-10 10:19:02.015489] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:40.184 [2024-06-10 10:19:02.016783] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:40.184 [2024-06-10 10:19:02.016803] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:40.184 [2024-06-10 10:19:02.016851] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:40.184 [2024-06-10 10:19:02.016867] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:40.184 [2024-06-10 10:19:02.016938] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:40.184 [2024-06-10 10:19:02.016945] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:40.184 [2024-06-10 10:19:02.016952] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f282a0 name raid_bdev1, state configuring 00:23:40.184 [2024-06-10 10:19:02.016965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:40.184 [2024-06-10 10:19:02.017010] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f279d0 00:23:40.184 [2024-06-10 10:19:02.017016] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:40.184 [2024-06-10 10:19:02.017146] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f28350 00:23:40.184 [2024-06-10 10:19:02.017237] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f279d0 00:23:40.184 [2024-06-10 10:19:02.017242] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f279d0 00:23:40.184 [2024-06-10 10:19:02.017315] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:40.184 pt1 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.184 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.445 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:40.445 "name": "raid_bdev1", 00:23:40.445 "uuid": "572b2e92-7781-43cf-8911-db358fd870bc", 00:23:40.445 "strip_size_kb": 0, 00:23:40.445 "state": "online", 00:23:40.445 "raid_level": "raid1", 00:23:40.445 "superblock": true, 00:23:40.445 "num_base_bdevs": 2, 00:23:40.445 "num_base_bdevs_discovered": 1, 00:23:40.445 "num_base_bdevs_operational": 1, 00:23:40.445 "base_bdevs_list": [ 00:23:40.445 { 00:23:40.445 "name": null, 00:23:40.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:40.445 "is_configured": false, 00:23:40.445 "data_offset": 256, 00:23:40.445 "data_size": 7936 00:23:40.445 }, 00:23:40.445 { 00:23:40.445 "name": "pt2", 00:23:40.445 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:40.445 "is_configured": true, 00:23:40.445 "data_offset": 256, 00:23:40.445 "data_size": 7936 00:23:40.445 } 00:23:40.445 ] 00:23:40.445 }' 00:23:40.445 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:40.445 10:19:02 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:41.015 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:41.015 
10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:41.276 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:23:41.276 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:41.276 10:19:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:23:41.276 [2024-06-10 10:19:03.122397] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:41.276 10:19:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 572b2e92-7781-43cf-8911-db358fd870bc '!=' 572b2e92-7781-43cf-8911-db358fd870bc ']' 00:23:41.276 10:19:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 1106966 00:23:41.276 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@949 -- # '[' -z 1106966 ']' 00:23:41.276 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # kill -0 1106966 00:23:41.276 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # uname 00:23:41.629 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:41.629 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1106966 00:23:41.630 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:41.630 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:41.630 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1106966' 00:23:41.630 killing process with pid 1106966 00:23:41.630 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@968 -- # kill 1106966 00:23:41.630 [2024-06-10 10:19:03.190228] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:41.630 [2024-06-10 10:19:03.190266] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:41.630 [2024-06-10 10:19:03.190295] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:41.630 [2024-06-10 10:19:03.190300] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f279d0 name raid_bdev1, state offline 00:23:41.630 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@973 -- # wait 1106966 00:23:41.630 [2024-06-10 10:19:03.199473] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:41.630 10:19:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:23:41.630 00:23:41.630 real 0m12.884s 00:23:41.630 user 0m23.846s 00:23:41.630 sys 0m1.987s 00:23:41.630 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:41.630 10:19:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:41.630 ************************************ 00:23:41.630 END TEST raid_superblock_test_4k 00:23:41.630 ************************************ 00:23:41.630 10:19:03 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:23:41.630 10:19:03 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:23:41.630 10:19:03 
bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:23:41.630 10:19:03 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:41.630 10:19:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:41.630 ************************************ 00:23:41.630 START TEST raid_rebuild_test_sb_4k 00:23:41.630 ************************************ 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=1109452 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 1109452 /var/tmp/spdk-raid.sock 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@830 -- # '[' -z 1109452 ']' 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:41.630 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:41.630 10:19:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:41.630 [2024-06-10 10:19:03.450499] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:23:41.630 [2024-06-10 10:19:03.450544] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1109452 ] 00:23:41.630 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:41.630 Zero copy mechanism will not be used. 00:23:41.892 [2024-06-10 10:19:03.539117] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:41.892 [2024-06-10 10:19:03.601173] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:41.892 [2024-06-10 10:19:03.645902] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:41.892 [2024-06-10 10:19:03.645927] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:42.463 10:19:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:42.463 10:19:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@863 -- # return 0 00:23:42.463 10:19:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:42.463 10:19:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:23:42.724 BaseBdev1_malloc 00:23:42.724 10:19:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:42.985 [2024-06-10 10:19:04.648158] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:42.985 [2024-06-10 10:19:04.648195] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:42.985 [2024-06-10 10:19:04.648207] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17e0a50 00:23:42.985 [2024-06-10 10:19:04.648214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:42.985 [2024-06-10 10:19:04.649490] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:42.985 [2024-06-10 10:19:04.649509] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:42.985 BaseBdev1 00:23:42.985 10:19:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in 
"${base_bdevs[@]}" 00:23:42.985 10:19:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:23:42.985 BaseBdev2_malloc 00:23:43.246 10:19:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:43.246 [2024-06-10 10:19:05.035043] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:43.246 [2024-06-10 10:19:05.035072] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:43.246 [2024-06-10 10:19:05.035084] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17e15a0 00:23:43.246 [2024-06-10 10:19:05.035090] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:43.246 [2024-06-10 10:19:05.036235] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:43.246 [2024-06-10 10:19:05.036254] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:43.246 BaseBdev2 00:23:43.246 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:23:43.506 spare_malloc 00:23:43.506 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:43.766 spare_delay 00:23:43.766 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:43.766 [2024-06-10 10:19:05.614124] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:43.766 [2024-06-10 10:19:05.614155] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:43.766 [2024-06-10 10:19:05.614166] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x198f450 00:23:43.766 [2024-06-10 10:19:05.614172] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:43.766 [2024-06-10 10:19:05.615337] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:43.766 [2024-06-10 10:19:05.615355] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:43.766 spare 00:23:43.766 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:44.026 [2024-06-10 10:19:05.794602] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:44.026 [2024-06-10 10:19:05.795576] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:44.026 [2024-06-10 10:19:05.795694] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x198e960 00:23:44.026 [2024-06-10 10:19:05.795702] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:44.026 [2024-06-10 10:19:05.795852] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17df530 00:23:44.026 [2024-06-10 10:19:05.795959] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x198e960 00:23:44.026 [2024-06-10 10:19:05.795965] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x198e960 00:23:44.026 [2024-06-10 10:19:05.796033] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.026 10:19:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:44.286 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:44.286 "name": "raid_bdev1", 00:23:44.286 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:44.286 "strip_size_kb": 0, 00:23:44.286 "state": "online", 00:23:44.286 "raid_level": "raid1", 00:23:44.286 "superblock": true, 00:23:44.286 "num_base_bdevs": 2, 00:23:44.286 "num_base_bdevs_discovered": 2, 00:23:44.286 "num_base_bdevs_operational": 2, 00:23:44.286 "base_bdevs_list": [ 00:23:44.286 { 00:23:44.286 "name": "BaseBdev1", 00:23:44.286 "uuid": "5c6e651a-beaa-503b-b900-846da6e1815b", 00:23:44.286 "is_configured": true, 00:23:44.286 "data_offset": 256, 00:23:44.286 "data_size": 7936 00:23:44.286 }, 00:23:44.286 { 00:23:44.286 "name": "BaseBdev2", 00:23:44.286 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:44.286 "is_configured": true, 00:23:44.286 "data_offset": 256, 00:23:44.286 "data_size": 7936 00:23:44.286 } 00:23:44.286 ] 00:23:44.286 }' 00:23:44.286 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:44.286 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:44.857 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:44.857 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:44.857 [2024-06-10 10:19:06.705066] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:45.118 10:19:06 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:45.118 10:19:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:45.378 [2024-06-10 10:19:07.081870] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19914f0 00:23:45.378 /dev/nbd0 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:45.378 1+0 records in 00:23:45.378 1+0 records out 00:23:45.378 4096 bytes (4.1 kB, 
4.0 KiB) copied, 0.000179517 s, 22.8 MB/s 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:45.378 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:23:45.948 7936+0 records in 00:23:45.948 7936+0 records out 00:23:45.948 32505856 bytes (33 MB, 31 MiB) copied, 0.540554 s, 60.1 MB/s 00:23:45.948 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:45.948 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:45.948 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:45.948 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:45.948 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:23:45.948 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:45.948 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:46.209 [2024-06-10 10:19:07.867606] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:46.209 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:46.209 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:46.209 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:46.209 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:46.209 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:46.209 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:46.209 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:23:46.209 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:23:46.209 10:19:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:46.209 [2024-06-10 10:19:08.061158] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:46.470 10:19:08 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:46.470 "name": "raid_bdev1", 00:23:46.470 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:46.470 "strip_size_kb": 0, 00:23:46.470 "state": "online", 00:23:46.470 "raid_level": "raid1", 00:23:46.470 "superblock": true, 00:23:46.470 "num_base_bdevs": 2, 00:23:46.470 "num_base_bdevs_discovered": 1, 00:23:46.470 "num_base_bdevs_operational": 1, 00:23:46.470 "base_bdevs_list": [ 00:23:46.470 { 00:23:46.470 "name": null, 00:23:46.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:46.470 "is_configured": false, 00:23:46.470 "data_offset": 256, 00:23:46.470 "data_size": 7936 00:23:46.470 }, 00:23:46.470 { 00:23:46.470 "name": "BaseBdev2", 00:23:46.470 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:46.470 "is_configured": true, 00:23:46.470 "data_offset": 256, 00:23:46.470 "data_size": 7936 00:23:46.470 } 00:23:46.470 ] 00:23:46.470 }' 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:46.470 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:47.041 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:47.041 [2024-06-10 10:19:08.903283] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:47.041 [2024-06-10 10:19:08.906613] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17e0130 00:23:47.301 [2024-06-10 10:19:08.908185] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:47.301 10:19:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:48.240 10:19:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:48.240 10:19:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:23:48.240 10:19:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:48.240 10:19:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:48.240 10:19:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:48.240 10:19:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.240 10:19:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.500 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:48.500 "name": "raid_bdev1", 00:23:48.500 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:48.500 "strip_size_kb": 0, 00:23:48.500 "state": "online", 00:23:48.500 "raid_level": "raid1", 00:23:48.500 "superblock": true, 00:23:48.500 "num_base_bdevs": 2, 00:23:48.500 "num_base_bdevs_discovered": 2, 00:23:48.500 "num_base_bdevs_operational": 2, 00:23:48.500 "process": { 00:23:48.500 "type": "rebuild", 00:23:48.500 "target": "spare", 00:23:48.500 "progress": { 00:23:48.500 "blocks": 2816, 00:23:48.500 "percent": 35 00:23:48.500 } 00:23:48.500 }, 00:23:48.500 "base_bdevs_list": [ 00:23:48.500 { 00:23:48.500 "name": "spare", 00:23:48.500 "uuid": "87b4caae-7826-5ae2-ae25-9ca9ce7a4fe6", 00:23:48.500 "is_configured": true, 00:23:48.500 "data_offset": 256, 00:23:48.500 "data_size": 7936 00:23:48.500 }, 00:23:48.500 { 00:23:48.500 "name": "BaseBdev2", 00:23:48.500 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:48.500 "is_configured": true, 00:23:48.500 "data_offset": 256, 00:23:48.500 "data_size": 7936 00:23:48.500 } 00:23:48.500 ] 00:23:48.500 }' 00:23:48.500 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:48.500 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:48.501 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:48.501 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:48.501 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:48.761 [2024-06-10 10:19:10.380939] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:48.761 [2024-06-10 10:19:10.417011] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:48.761 [2024-06-10 10:19:10.417043] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:48.761 [2024-06-10 10:19:10.417052] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:48.761 [2024-06-10 10:19:10.417057] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:48.761 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:48.761 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:48.761 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:48.761 10:19:10 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:48.761 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:48.761 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:48.761 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:48.761 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:48.761 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:48.761 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:48.761 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.761 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.022 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:49.022 "name": "raid_bdev1", 00:23:49.022 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:49.022 "strip_size_kb": 0, 00:23:49.022 "state": "online", 00:23:49.022 "raid_level": "raid1", 00:23:49.022 "superblock": true, 00:23:49.022 "num_base_bdevs": 2, 00:23:49.022 "num_base_bdevs_discovered": 1, 00:23:49.022 "num_base_bdevs_operational": 1, 00:23:49.022 "base_bdevs_list": [ 00:23:49.022 { 00:23:49.022 "name": null, 00:23:49.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:49.022 "is_configured": false, 00:23:49.022 "data_offset": 256, 00:23:49.022 "data_size": 7936 00:23:49.022 }, 00:23:49.022 { 00:23:49.022 "name": "BaseBdev2", 00:23:49.022 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:49.022 "is_configured": true, 00:23:49.022 "data_offset": 256, 00:23:49.022 "data_size": 7936 00:23:49.022 } 00:23:49.022 ] 00:23:49.022 }' 00:23:49.022 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:49.022 10:19:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:49.282 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:49.282 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:49.282 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:49.282 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:49.282 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:49.282 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.282 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.542 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:49.542 "name": "raid_bdev1", 00:23:49.542 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:49.542 "strip_size_kb": 0, 00:23:49.542 "state": "online", 00:23:49.542 "raid_level": "raid1", 00:23:49.542 "superblock": true, 00:23:49.542 "num_base_bdevs": 2, 00:23:49.542 "num_base_bdevs_discovered": 1, 
00:23:49.542 "num_base_bdevs_operational": 1, 00:23:49.542 "base_bdevs_list": [ 00:23:49.542 { 00:23:49.542 "name": null, 00:23:49.542 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:49.542 "is_configured": false, 00:23:49.542 "data_offset": 256, 00:23:49.542 "data_size": 7936 00:23:49.542 }, 00:23:49.542 { 00:23:49.542 "name": "BaseBdev2", 00:23:49.542 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:49.542 "is_configured": true, 00:23:49.542 "data_offset": 256, 00:23:49.542 "data_size": 7936 00:23:49.542 } 00:23:49.542 ] 00:23:49.542 }' 00:23:49.542 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:49.542 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:49.542 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:49.802 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:49.802 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:49.802 [2024-06-10 10:19:11.608073] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:49.802 [2024-06-10 10:19:11.611384] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x198b100 00:23:49.802 [2024-06-10 10:19:11.612509] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:49.802 10:19:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:51.188 "name": "raid_bdev1", 00:23:51.188 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:51.188 "strip_size_kb": 0, 00:23:51.188 "state": "online", 00:23:51.188 "raid_level": "raid1", 00:23:51.188 "superblock": true, 00:23:51.188 "num_base_bdevs": 2, 00:23:51.188 "num_base_bdevs_discovered": 2, 00:23:51.188 "num_base_bdevs_operational": 2, 00:23:51.188 "process": { 00:23:51.188 "type": "rebuild", 00:23:51.188 "target": "spare", 00:23:51.188 "progress": { 00:23:51.188 "blocks": 2816, 00:23:51.188 "percent": 35 00:23:51.188 } 00:23:51.188 }, 00:23:51.188 "base_bdevs_list": [ 00:23:51.188 { 00:23:51.188 "name": "spare", 00:23:51.188 "uuid": "87b4caae-7826-5ae2-ae25-9ca9ce7a4fe6", 00:23:51.188 "is_configured": true, 00:23:51.188 "data_offset": 256, 00:23:51.188 "data_size": 7936 00:23:51.188 }, 00:23:51.188 { 
00:23:51.188 "name": "BaseBdev2", 00:23:51.188 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:51.188 "is_configured": true, 00:23:51.188 "data_offset": 256, 00:23:51.188 "data_size": 7936 00:23:51.188 } 00:23:51.188 ] 00:23:51.188 }' 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:51.188 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=844 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.188 10:19:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.449 10:19:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:51.449 "name": "raid_bdev1", 00:23:51.449 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:51.449 "strip_size_kb": 0, 00:23:51.449 "state": "online", 00:23:51.449 "raid_level": "raid1", 00:23:51.449 "superblock": true, 00:23:51.449 "num_base_bdevs": 2, 00:23:51.449 "num_base_bdevs_discovered": 2, 00:23:51.449 "num_base_bdevs_operational": 2, 00:23:51.449 "process": { 00:23:51.449 "type": "rebuild", 00:23:51.449 "target": "spare", 00:23:51.449 "progress": { 00:23:51.449 "blocks": 3584, 00:23:51.449 "percent": 45 00:23:51.449 } 00:23:51.449 }, 00:23:51.449 "base_bdevs_list": [ 00:23:51.449 { 00:23:51.449 "name": "spare", 00:23:51.449 "uuid": "87b4caae-7826-5ae2-ae25-9ca9ce7a4fe6", 00:23:51.449 "is_configured": true, 00:23:51.449 "data_offset": 256, 00:23:51.449 "data_size": 7936 00:23:51.449 }, 00:23:51.449 { 00:23:51.449 "name": "BaseBdev2", 00:23:51.449 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:51.449 "is_configured": 
true, 00:23:51.449 "data_offset": 256, 00:23:51.449 "data_size": 7936 00:23:51.449 } 00:23:51.449 ] 00:23:51.449 }' 00:23:51.449 10:19:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:51.449 10:19:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:51.449 10:19:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:51.449 10:19:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:51.449 10:19:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:52.390 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:52.390 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:52.390 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:52.390 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:52.390 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:52.390 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:52.390 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.390 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.650 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:52.650 "name": "raid_bdev1", 00:23:52.650 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:52.650 "strip_size_kb": 0, 00:23:52.650 "state": "online", 00:23:52.650 "raid_level": "raid1", 00:23:52.650 "superblock": true, 00:23:52.650 "num_base_bdevs": 2, 00:23:52.650 "num_base_bdevs_discovered": 2, 00:23:52.650 "num_base_bdevs_operational": 2, 00:23:52.650 "process": { 00:23:52.650 "type": "rebuild", 00:23:52.650 "target": "spare", 00:23:52.650 "progress": { 00:23:52.650 "blocks": 6656, 00:23:52.650 "percent": 83 00:23:52.650 } 00:23:52.650 }, 00:23:52.650 "base_bdevs_list": [ 00:23:52.650 { 00:23:52.650 "name": "spare", 00:23:52.650 "uuid": "87b4caae-7826-5ae2-ae25-9ca9ce7a4fe6", 00:23:52.650 "is_configured": true, 00:23:52.650 "data_offset": 256, 00:23:52.650 "data_size": 7936 00:23:52.650 }, 00:23:52.650 { 00:23:52.650 "name": "BaseBdev2", 00:23:52.650 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:52.650 "is_configured": true, 00:23:52.650 "data_offset": 256, 00:23:52.650 "data_size": 7936 00:23:52.650 } 00:23:52.650 ] 00:23:52.650 }' 00:23:52.650 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:52.650 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:52.650 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:52.650 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:52.650 10:19:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:52.910 [2024-06-10 10:19:14.730590] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on 
raid_bdev1 00:23:52.910 [2024-06-10 10:19:14.730638] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:52.910 [2024-06-10 10:19:14.730702] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:53.850 "name": "raid_bdev1", 00:23:53.850 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:53.850 "strip_size_kb": 0, 00:23:53.850 "state": "online", 00:23:53.850 "raid_level": "raid1", 00:23:53.850 "superblock": true, 00:23:53.850 "num_base_bdevs": 2, 00:23:53.850 "num_base_bdevs_discovered": 2, 00:23:53.850 "num_base_bdevs_operational": 2, 00:23:53.850 "base_bdevs_list": [ 00:23:53.850 { 00:23:53.850 "name": "spare", 00:23:53.850 "uuid": "87b4caae-7826-5ae2-ae25-9ca9ce7a4fe6", 00:23:53.850 "is_configured": true, 00:23:53.850 "data_offset": 256, 00:23:53.850 "data_size": 7936 00:23:53.850 }, 00:23:53.850 { 00:23:53.850 "name": "BaseBdev2", 00:23:53.850 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:53.850 "is_configured": true, 00:23:53.850 "data_offset": 256, 00:23:53.850 "data_size": 7936 00:23:53.850 } 00:23:53.850 ] 00:23:53.850 }' 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.850 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:54.111 "name": "raid_bdev1", 00:23:54.111 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:54.111 "strip_size_kb": 0, 00:23:54.111 "state": "online", 00:23:54.111 "raid_level": "raid1", 00:23:54.111 "superblock": true, 00:23:54.111 "num_base_bdevs": 2, 00:23:54.111 "num_base_bdevs_discovered": 2, 00:23:54.111 "num_base_bdevs_operational": 2, 00:23:54.111 "base_bdevs_list": [ 00:23:54.111 { 00:23:54.111 "name": "spare", 00:23:54.111 "uuid": "87b4caae-7826-5ae2-ae25-9ca9ce7a4fe6", 00:23:54.111 "is_configured": true, 00:23:54.111 "data_offset": 256, 00:23:54.111 "data_size": 7936 00:23:54.111 }, 00:23:54.111 { 00:23:54.111 "name": "BaseBdev2", 00:23:54.111 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:54.111 "is_configured": true, 00:23:54.111 "data_offset": 256, 00:23:54.111 "data_size": 7936 00:23:54.111 } 00:23:54.111 ] 00:23:54.111 }' 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.111 10:19:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.371 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:54.371 "name": "raid_bdev1", 00:23:54.371 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:54.371 "strip_size_kb": 0, 00:23:54.371 "state": "online", 00:23:54.371 "raid_level": "raid1", 00:23:54.371 "superblock": true, 00:23:54.371 "num_base_bdevs": 2, 00:23:54.371 "num_base_bdevs_discovered": 2, 00:23:54.371 "num_base_bdevs_operational": 2, 00:23:54.371 
"base_bdevs_list": [ 00:23:54.371 { 00:23:54.371 "name": "spare", 00:23:54.371 "uuid": "87b4caae-7826-5ae2-ae25-9ca9ce7a4fe6", 00:23:54.372 "is_configured": true, 00:23:54.372 "data_offset": 256, 00:23:54.372 "data_size": 7936 00:23:54.372 }, 00:23:54.372 { 00:23:54.372 "name": "BaseBdev2", 00:23:54.372 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:54.372 "is_configured": true, 00:23:54.372 "data_offset": 256, 00:23:54.372 "data_size": 7936 00:23:54.372 } 00:23:54.372 ] 00:23:54.372 }' 00:23:54.372 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:54.372 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:54.942 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:54.942 [2024-06-10 10:19:16.727834] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:54.942 [2024-06-10 10:19:16.727852] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:54.942 [2024-06-10 10:19:16.727895] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:54.942 [2024-06-10 10:19:16.727935] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:54.942 [2024-06-10 10:19:16.727941] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x198e960 name raid_bdev1, state offline 00:23:54.942 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:23:54.942 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:55.203 10:19:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:55.463 /dev/nbd0 00:23:55.463 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 
00:23:55.463 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:55.463 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:23:55.463 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:23:55.463 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:55.463 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:55.463 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:23:55.463 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:23:55.463 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:55.463 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:55.463 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:55.463 1+0 records in 00:23:55.463 1+0 records out 00:23:55.463 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000102943 s, 39.8 MB/s 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:55.464 /dev/nbd1 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:55.464 1+0 records in 00:23:55.464 1+0 records out 00:23:55.464 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000315109 s, 13.0 MB/s 00:23:55.464 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:55.724 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:55.984 10:19:17 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:55.984 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:56.244 10:19:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:56.505 [2024-06-10 10:19:18.169297] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:56.505 [2024-06-10 10:19:18.169332] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:56.505 [2024-06-10 10:19:18.169345] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x198c2c0 00:23:56.505 [2024-06-10 10:19:18.169352] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:56.505 [2024-06-10 10:19:18.170658] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:56.505 [2024-06-10 10:19:18.170679] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:56.505 [2024-06-10 10:19:18.170740] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:56.505 [2024-06-10 10:19:18.170759] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:56.505 [2024-06-10 10:19:18.170841] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:56.505 spare 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:56.505 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.505 [2024-06-10 10:19:18.271131] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17e5cf0 00:23:56.505 [2024-06-10 10:19:18.271139] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:56.505 [2024-06-10 10:19:18.271283] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17e32f0 00:23:56.505 [2024-06-10 10:19:18.271391] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17e5cf0 00:23:56.505 [2024-06-10 10:19:18.271397] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17e5cf0 00:23:56.505 [2024-06-10 10:19:18.271469] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:56.765 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:56.765 "name": "raid_bdev1", 00:23:56.765 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:56.765 "strip_size_kb": 0, 00:23:56.765 "state": "online", 00:23:56.765 "raid_level": "raid1", 00:23:56.765 "superblock": true, 00:23:56.765 "num_base_bdevs": 2, 00:23:56.765 "num_base_bdevs_discovered": 2, 00:23:56.765 "num_base_bdevs_operational": 2, 00:23:56.765 "base_bdevs_list": [ 00:23:56.765 { 00:23:56.765 "name": "spare", 00:23:56.765 "uuid": "87b4caae-7826-5ae2-ae25-9ca9ce7a4fe6", 00:23:56.765 "is_configured": true, 00:23:56.765 "data_offset": 256, 00:23:56.765 "data_size": 7936 00:23:56.765 }, 00:23:56.765 { 00:23:56.765 "name": "BaseBdev2", 00:23:56.765 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:56.765 "is_configured": true, 00:23:56.765 "data_offset": 256, 00:23:56.765 "data_size": 7936 00:23:56.765 } 00:23:56.765 ] 00:23:56.765 }' 00:23:56.765 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:56.765 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:57.025 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:57.025 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:57.025 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:57.025 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:57.025 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:57.025 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.025 10:19:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.286 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:57.286 "name": "raid_bdev1", 00:23:57.286 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:57.286 "strip_size_kb": 0, 00:23:57.286 "state": "online", 00:23:57.286 "raid_level": "raid1", 00:23:57.286 "superblock": true, 00:23:57.286 "num_base_bdevs": 2, 00:23:57.286 "num_base_bdevs_discovered": 2, 00:23:57.286 "num_base_bdevs_operational": 2, 00:23:57.286 "base_bdevs_list": [ 00:23:57.286 { 00:23:57.286 "name": "spare", 00:23:57.286 "uuid": 
"87b4caae-7826-5ae2-ae25-9ca9ce7a4fe6", 00:23:57.286 "is_configured": true, 00:23:57.286 "data_offset": 256, 00:23:57.286 "data_size": 7936 00:23:57.286 }, 00:23:57.286 { 00:23:57.286 "name": "BaseBdev2", 00:23:57.286 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:57.286 "is_configured": true, 00:23:57.286 "data_offset": 256, 00:23:57.286 "data_size": 7936 00:23:57.286 } 00:23:57.286 ] 00:23:57.286 }' 00:23:57.286 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:57.286 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:57.286 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:57.286 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:57.286 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.286 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:57.546 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:57.546 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:57.806 [2024-06-10 10:19:19.448608] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:57.806 "name": "raid_bdev1", 00:23:57.806 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:57.806 "strip_size_kb": 0, 00:23:57.806 "state": "online", 00:23:57.806 "raid_level": "raid1", 00:23:57.806 "superblock": true, 00:23:57.806 "num_base_bdevs": 2, 00:23:57.806 "num_base_bdevs_discovered": 1, 00:23:57.806 "num_base_bdevs_operational": 1, 00:23:57.806 
"base_bdevs_list": [ 00:23:57.806 { 00:23:57.806 "name": null, 00:23:57.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:57.806 "is_configured": false, 00:23:57.806 "data_offset": 256, 00:23:57.806 "data_size": 7936 00:23:57.806 }, 00:23:57.806 { 00:23:57.806 "name": "BaseBdev2", 00:23:57.806 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:57.806 "is_configured": true, 00:23:57.806 "data_offset": 256, 00:23:57.806 "data_size": 7936 00:23:57.806 } 00:23:57.806 ] 00:23:57.806 }' 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:57.806 10:19:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:58.376 10:19:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:58.637 [2024-06-10 10:19:20.250660] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:58.637 [2024-06-10 10:19:20.250786] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:58.637 [2024-06-10 10:19:20.250796] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:58.637 [2024-06-10 10:19:20.250814] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:58.637 [2024-06-10 10:19:20.254011] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1990130 00:23:58.637 [2024-06-10 10:19:20.255624] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:58.637 10:19:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:59.577 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:59.577 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:59.577 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:59.577 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:59.577 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:59.577 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.577 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.838 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:59.838 "name": "raid_bdev1", 00:23:59.838 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:23:59.838 "strip_size_kb": 0, 00:23:59.838 "state": "online", 00:23:59.838 "raid_level": "raid1", 00:23:59.838 "superblock": true, 00:23:59.838 "num_base_bdevs": 2, 00:23:59.838 "num_base_bdevs_discovered": 2, 00:23:59.838 "num_base_bdevs_operational": 2, 00:23:59.838 "process": { 00:23:59.838 "type": "rebuild", 00:23:59.838 "target": "spare", 00:23:59.838 "progress": { 00:23:59.838 "blocks": 2816, 00:23:59.838 "percent": 35 00:23:59.838 } 00:23:59.838 }, 00:23:59.838 "base_bdevs_list": [ 00:23:59.838 { 00:23:59.838 "name": "spare", 00:23:59.838 "uuid": "87b4caae-7826-5ae2-ae25-9ca9ce7a4fe6", 00:23:59.838 
"is_configured": true, 00:23:59.838 "data_offset": 256, 00:23:59.838 "data_size": 7936 00:23:59.838 }, 00:23:59.838 { 00:23:59.838 "name": "BaseBdev2", 00:23:59.838 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:23:59.838 "is_configured": true, 00:23:59.838 "data_offset": 256, 00:23:59.838 "data_size": 7936 00:23:59.838 } 00:23:59.838 ] 00:23:59.838 }' 00:23:59.838 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:59.838 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:59.838 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:59.838 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:59.838 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:00.099 [2024-06-10 10:19:21.736112] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:00.099 [2024-06-10 10:19:21.764463] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:00.099 [2024-06-10 10:19:21.764495] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:00.099 [2024-06-10 10:19:21.764504] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:00.099 [2024-06-10 10:19:21.764508] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.099 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.403 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:00.403 "name": "raid_bdev1", 00:24:00.403 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:24:00.403 "strip_size_kb": 0, 00:24:00.403 "state": "online", 00:24:00.403 "raid_level": "raid1", 00:24:00.403 "superblock": true, 00:24:00.403 "num_base_bdevs": 2, 00:24:00.403 "num_base_bdevs_discovered": 1, 00:24:00.403 
"num_base_bdevs_operational": 1, 00:24:00.403 "base_bdevs_list": [ 00:24:00.403 { 00:24:00.403 "name": null, 00:24:00.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:00.403 "is_configured": false, 00:24:00.403 "data_offset": 256, 00:24:00.403 "data_size": 7936 00:24:00.403 }, 00:24:00.403 { 00:24:00.403 "name": "BaseBdev2", 00:24:00.403 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:24:00.403 "is_configured": true, 00:24:00.403 "data_offset": 256, 00:24:00.403 "data_size": 7936 00:24:00.403 } 00:24:00.403 ] 00:24:00.403 }' 00:24:00.403 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:00.403 10:19:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:00.679 10:19:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:00.938 [2024-06-10 10:19:22.646619] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:00.938 [2024-06-10 10:19:22.646657] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:00.938 [2024-06-10 10:19:22.646673] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1991070 00:24:00.938 [2024-06-10 10:19:22.646680] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:00.938 [2024-06-10 10:19:22.646994] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:00.938 [2024-06-10 10:19:22.647006] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:00.938 [2024-06-10 10:19:22.647066] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:00.938 [2024-06-10 10:19:22.647073] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:00.938 [2024-06-10 10:19:22.647078] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:00.938 [2024-06-10 10:19:22.647090] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:00.938 [2024-06-10 10:19:22.650272] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17e4360 00:24:00.938 [2024-06-10 10:19:22.651406] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:00.938 spare 00:24:00.938 10:19:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:01.877 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:01.878 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:01.878 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:01.878 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:01.878 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:01.878 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.878 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:02.138 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:02.138 "name": "raid_bdev1", 00:24:02.138 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:24:02.138 "strip_size_kb": 0, 00:24:02.138 "state": "online", 00:24:02.138 "raid_level": "raid1", 00:24:02.138 "superblock": true, 00:24:02.138 "num_base_bdevs": 2, 00:24:02.138 "num_base_bdevs_discovered": 2, 00:24:02.138 "num_base_bdevs_operational": 2, 00:24:02.138 "process": { 00:24:02.138 "type": "rebuild", 00:24:02.138 "target": "spare", 00:24:02.138 "progress": { 00:24:02.138 "blocks": 2816, 00:24:02.138 "percent": 35 00:24:02.138 } 00:24:02.138 }, 00:24:02.138 "base_bdevs_list": [ 00:24:02.138 { 00:24:02.138 "name": "spare", 00:24:02.138 "uuid": "87b4caae-7826-5ae2-ae25-9ca9ce7a4fe6", 00:24:02.138 "is_configured": true, 00:24:02.138 "data_offset": 256, 00:24:02.138 "data_size": 7936 00:24:02.138 }, 00:24:02.138 { 00:24:02.138 "name": "BaseBdev2", 00:24:02.138 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:24:02.138 "is_configured": true, 00:24:02.138 "data_offset": 256, 00:24:02.138 "data_size": 7936 00:24:02.138 } 00:24:02.138 ] 00:24:02.138 }' 00:24:02.138 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:02.138 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:02.138 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:02.138 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:02.138 10:19:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:02.398 [2024-06-10 10:19:24.079734] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:02.398 [2024-06-10 10:19:24.160220] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:02.398 [2024-06-10 10:19:24.160253] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:02.398 [2024-06-10 10:19:24.160262] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:02.398 [2024-06-10 10:19:24.160267] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:02.398 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:02.658 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:02.658 "name": "raid_bdev1", 00:24:02.658 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:24:02.658 "strip_size_kb": 0, 00:24:02.658 "state": "online", 00:24:02.658 "raid_level": "raid1", 00:24:02.658 "superblock": true, 00:24:02.658 "num_base_bdevs": 2, 00:24:02.658 "num_base_bdevs_discovered": 1, 00:24:02.658 "num_base_bdevs_operational": 1, 00:24:02.658 "base_bdevs_list": [ 00:24:02.658 { 00:24:02.658 "name": null, 00:24:02.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.658 "is_configured": false, 00:24:02.658 "data_offset": 256, 00:24:02.658 "data_size": 7936 00:24:02.658 }, 00:24:02.658 { 00:24:02.658 "name": "BaseBdev2", 00:24:02.658 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:24:02.658 "is_configured": true, 00:24:02.658 "data_offset": 256, 00:24:02.658 "data_size": 7936 00:24:02.658 } 00:24:02.658 ] 00:24:02.658 }' 00:24:02.658 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:02.658 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:03.228 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:03.228 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:03.228 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:03.228 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:03.228 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:03.228 10:19:24 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.228 10:19:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.489 10:19:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:03.489 "name": "raid_bdev1", 00:24:03.489 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:24:03.489 "strip_size_kb": 0, 00:24:03.489 "state": "online", 00:24:03.489 "raid_level": "raid1", 00:24:03.489 "superblock": true, 00:24:03.489 "num_base_bdevs": 2, 00:24:03.489 "num_base_bdevs_discovered": 1, 00:24:03.489 "num_base_bdevs_operational": 1, 00:24:03.489 "base_bdevs_list": [ 00:24:03.489 { 00:24:03.489 "name": null, 00:24:03.489 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.489 "is_configured": false, 00:24:03.489 "data_offset": 256, 00:24:03.489 "data_size": 7936 00:24:03.489 }, 00:24:03.489 { 00:24:03.489 "name": "BaseBdev2", 00:24:03.489 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:24:03.489 "is_configured": true, 00:24:03.489 "data_offset": 256, 00:24:03.489 "data_size": 7936 00:24:03.489 } 00:24:03.489 ] 00:24:03.489 }' 00:24:03.489 10:19:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:03.489 10:19:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:03.489 10:19:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:03.489 10:19:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:03.489 10:19:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:03.749 10:19:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:03.749 [2024-06-10 10:19:25.587543] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:03.749 [2024-06-10 10:19:25.587578] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:03.749 [2024-06-10 10:19:25.587591] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x198f750 00:24:03.749 [2024-06-10 10:19:25.587597] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:03.749 [2024-06-10 10:19:25.587883] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:03.749 [2024-06-10 10:19:25.587894] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:03.749 [2024-06-10 10:19:25.587941] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:03.749 [2024-06-10 10:19:25.587948] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:03.749 [2024-06-10 10:19:25.587953] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:03.749 BaseBdev1 00:24:04.009 10:19:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.949 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.210 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:05.210 "name": "raid_bdev1", 00:24:05.210 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:24:05.210 "strip_size_kb": 0, 00:24:05.210 "state": "online", 00:24:05.210 "raid_level": "raid1", 00:24:05.210 "superblock": true, 00:24:05.210 "num_base_bdevs": 2, 00:24:05.210 "num_base_bdevs_discovered": 1, 00:24:05.210 "num_base_bdevs_operational": 1, 00:24:05.210 "base_bdevs_list": [ 00:24:05.210 { 00:24:05.210 "name": null, 00:24:05.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.210 "is_configured": false, 00:24:05.210 "data_offset": 256, 00:24:05.210 "data_size": 7936 00:24:05.210 }, 00:24:05.210 { 00:24:05.210 "name": "BaseBdev2", 00:24:05.210 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:24:05.210 "is_configured": true, 00:24:05.210 "data_offset": 256, 00:24:05.210 "data_size": 7936 00:24:05.210 } 00:24:05.210 ] 00:24:05.210 }' 00:24:05.210 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:05.210 10:19:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:05.780 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:05.780 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:05.780 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:05.780 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:05.780 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:05.780 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.780 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.780 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 
-- # raid_bdev_info='{ 00:24:05.780 "name": "raid_bdev1", 00:24:05.780 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:24:05.780 "strip_size_kb": 0, 00:24:05.780 "state": "online", 00:24:05.780 "raid_level": "raid1", 00:24:05.780 "superblock": true, 00:24:05.780 "num_base_bdevs": 2, 00:24:05.780 "num_base_bdevs_discovered": 1, 00:24:05.780 "num_base_bdevs_operational": 1, 00:24:05.780 "base_bdevs_list": [ 00:24:05.780 { 00:24:05.780 "name": null, 00:24:05.780 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.780 "is_configured": false, 00:24:05.780 "data_offset": 256, 00:24:05.780 "data_size": 7936 00:24:05.780 }, 00:24:05.780 { 00:24:05.780 "name": "BaseBdev2", 00:24:05.780 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:24:05.780 "is_configured": true, 00:24:05.780 "data_offset": 256, 00:24:05.780 "data_size": 7936 00:24:05.780 } 00:24:05.780 ] 00:24:05.780 }' 00:24:05.780 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:05.780 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:05.780 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@649 -- # local es=0 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:06.041 [2024-06-10 10:19:27.829265] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:06.041 [2024-06-10 10:19:27.829359] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: 
raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:06.041 [2024-06-10 10:19:27.829367] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:06.041 request: 00:24:06.041 { 00:24:06.041 "raid_bdev": "raid_bdev1", 00:24:06.041 "base_bdev": "BaseBdev1", 00:24:06.041 "method": "bdev_raid_add_base_bdev", 00:24:06.041 "req_id": 1 00:24:06.041 } 00:24:06.041 Got JSON-RPC error response 00:24:06.041 response: 00:24:06.041 { 00:24:06.041 "code": -22, 00:24:06.041 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:06.041 } 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # es=1 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:24:06.041 10:19:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.423 10:19:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.423 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:07.423 "name": "raid_bdev1", 00:24:07.423 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:24:07.423 "strip_size_kb": 0, 00:24:07.423 "state": "online", 00:24:07.423 "raid_level": "raid1", 00:24:07.423 "superblock": true, 00:24:07.423 "num_base_bdevs": 2, 00:24:07.423 "num_base_bdevs_discovered": 1, 00:24:07.423 "num_base_bdevs_operational": 1, 00:24:07.423 "base_bdevs_list": [ 00:24:07.423 { 00:24:07.423 "name": null, 00:24:07.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.423 "is_configured": false, 00:24:07.423 "data_offset": 256, 00:24:07.423 "data_size": 7936 00:24:07.423 }, 00:24:07.423 { 00:24:07.423 "name": "BaseBdev2", 00:24:07.423 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:24:07.423 "is_configured": true, 00:24:07.423 "data_offset": 256, 00:24:07.423 "data_size": 7936 
00:24:07.423 } 00:24:07.423 ] 00:24:07.423 }' 00:24:07.423 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:07.423 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:07.994 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:07.994 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.994 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:07.994 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:07.994 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:07.994 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.994 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.994 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:07.994 "name": "raid_bdev1", 00:24:07.994 "uuid": "69f58cbd-65b9-45d2-97ed-475202220b19", 00:24:07.994 "strip_size_kb": 0, 00:24:07.994 "state": "online", 00:24:07.994 "raid_level": "raid1", 00:24:07.994 "superblock": true, 00:24:07.994 "num_base_bdevs": 2, 00:24:07.994 "num_base_bdevs_discovered": 1, 00:24:07.994 "num_base_bdevs_operational": 1, 00:24:07.994 "base_bdevs_list": [ 00:24:07.994 { 00:24:07.994 "name": null, 00:24:07.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.994 "is_configured": false, 00:24:07.994 "data_offset": 256, 00:24:07.994 "data_size": 7936 00:24:07.994 }, 00:24:07.994 { 00:24:07.994 "name": "BaseBdev2", 00:24:07.994 "uuid": "9ab46ea2-3813-5a37-bc65-882c84c03a75", 00:24:07.994 "is_configured": true, 00:24:07.994 "data_offset": 256, 00:24:07.994 "data_size": 7936 00:24:07.994 } 00:24:07.994 ] 00:24:07.994 }' 00:24:07.994 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:07.994 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:07.994 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 1109452 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@949 -- # '[' -z 1109452 ']' 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # kill -0 1109452 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # uname 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1109452 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@967 -- # echo 'killing process with pid 1109452' 00:24:08.256 killing process with pid 1109452 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@968 -- # kill 1109452 00:24:08.256 Received shutdown signal, test time was about 60.000000 seconds 00:24:08.256 00:24:08.256 Latency(us) 00:24:08.256 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:08.256 =================================================================================================================== 00:24:08.256 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:08.256 [2024-06-10 10:19:29.937728] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:08.256 [2024-06-10 10:19:29.937800] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:08.256 [2024-06-10 10:19:29.937837] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:08.256 [2024-06-10 10:19:29.937846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17e5cf0 name raid_bdev1, state offline 00:24:08.256 10:19:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@973 -- # wait 1109452 00:24:08.256 [2024-06-10 10:19:29.952698] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:08.256 10:19:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:24:08.256 00:24:08.256 real 0m26.684s 00:24:08.256 user 0m41.550s 00:24:08.256 sys 0m3.219s 00:24:08.256 10:19:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:08.256 10:19:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:08.256 ************************************ 00:24:08.256 END TEST raid_rebuild_test_sb_4k 00:24:08.256 ************************************ 00:24:08.256 10:19:30 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:24:08.256 10:19:30 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:24:08.256 10:19:30 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:24:08.256 10:19:30 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:08.256 10:19:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:08.517 ************************************ 00:24:08.517 START TEST raid_state_function_test_sb_md_separate 00:24:08.517 ************************************ 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo 
BaseBdev1 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=1114228 00:24:08.517 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1114228' 00:24:08.517 Process raid pid: 1114228 00:24:08.518 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 1114228 /var/tmp/spdk-raid.sock 00:24:08.518 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:24:08.518 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@830 -- # '[' -z 1114228 ']' 00:24:08.518 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:08.518 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:08.518 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:08.518 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
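[editor's note] The waitforlisten 1114228 /var/tmp/spdk-raid.sock step above simply blocks until the freshly launched bdev_svc answers RPCs on its UNIX socket. As a rough, hedged illustration of the launch-and-wait pattern this trace follows — the polling loop below is a simplified stand-in for the real waitforlisten helper in autotest_common.sh, which performs additional checks; the paths, flags and socket name are copied from this run:

#!/usr/bin/env bash
# Sketch only: a simplified version of the start/wait sequence traced above,
# NOT the test suite's own waitforlisten implementation.
SPDK_ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk   # path as seen in this run
SOCK=/var/tmp/spdk-raid.sock

# Start the minimal bdev application with raid debug logging enabled,
# exactly as the trace shows (bdev_svc -r <sock> -i 0 -L bdev_raid).
"$SPDK_ROOT/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
raid_pid=$!

# Simplified readiness poll (stand-in for waitforlisten): retry an RPC this
# test issues anyway until the application answers on the socket.
until "$SPDK_ROOT/scripts/rpc.py" -s "$SOCK" bdev_raid_get_bdevs all >/dev/null 2>&1; do
    sleep 0.1
done
echo "bdev_svc (pid $raid_pid) is listening on $SOCK"

[end editor's note]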
00:24:08.518 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:08.518 10:19:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:08.518 [2024-06-10 10:19:30.204644] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:24:08.518 [2024-06-10 10:19:30.204685] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:24:08.518 [2024-06-10 10:19:30.291137] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:08.518 [2024-06-10 10:19:30.352932] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:08.778 [2024-06-10 10:19:30.398338] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:08.778 [2024-06-10 10:19:30.398359] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:09.350 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:09.350 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@863 -- # return 0 00:24:09.350 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:09.350 [2024-06-10 10:19:31.213733] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:09.350 [2024-06-10 10:19:31.213763] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:09.350 [2024-06-10 10:19:31.213769] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:09.350 [2024-06-10 10:19:31.213775] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:09.611 "name": "Existed_Raid", 00:24:09.611 "uuid": "fdac489d-2d6a-4097-b08c-88b3a6bc3000", 00:24:09.611 "strip_size_kb": 0, 00:24:09.611 "state": "configuring", 00:24:09.611 "raid_level": "raid1", 00:24:09.611 "superblock": true, 00:24:09.611 "num_base_bdevs": 2, 00:24:09.611 "num_base_bdevs_discovered": 0, 00:24:09.611 "num_base_bdevs_operational": 2, 00:24:09.611 "base_bdevs_list": [ 00:24:09.611 { 00:24:09.611 "name": "BaseBdev1", 00:24:09.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.611 "is_configured": false, 00:24:09.611 "data_offset": 0, 00:24:09.611 "data_size": 0 00:24:09.611 }, 00:24:09.611 { 00:24:09.611 "name": "BaseBdev2", 00:24:09.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.611 "is_configured": false, 00:24:09.611 "data_offset": 0, 00:24:09.611 "data_size": 0 00:24:09.611 } 00:24:09.611 ] 00:24:09.611 }' 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:09.611 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:10.182 10:19:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:10.449 [2024-06-10 10:19:32.095864] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:10.449 [2024-06-10 10:19:32.095880] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1635b00 name Existed_Raid, state configuring 00:24:10.450 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:10.450 [2024-06-10 10:19:32.288366] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:10.450 [2024-06-10 10:19:32.288386] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:10.450 [2024-06-10 10:19:32.288392] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:10.450 [2024-06-10 10:19:32.288397] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:10.450 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:24:10.717 [2024-06-10 10:19:32.475480] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:10.717 BaseBdev1 00:24:10.717 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:24:10.717 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:24:10.717 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:10.717 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- 
common/autotest_common.sh@900 -- # local i 00:24:10.717 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:10.717 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:10.717 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:10.978 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:11.239 [ 00:24:11.239 { 00:24:11.239 "name": "BaseBdev1", 00:24:11.239 "aliases": [ 00:24:11.239 "8dcf487f-559c-42cc-a300-cd18f4b6b128" 00:24:11.239 ], 00:24:11.239 "product_name": "Malloc disk", 00:24:11.239 "block_size": 4096, 00:24:11.239 "num_blocks": 8192, 00:24:11.239 "uuid": "8dcf487f-559c-42cc-a300-cd18f4b6b128", 00:24:11.239 "md_size": 32, 00:24:11.239 "md_interleave": false, 00:24:11.239 "dif_type": 0, 00:24:11.239 "assigned_rate_limits": { 00:24:11.239 "rw_ios_per_sec": 0, 00:24:11.239 "rw_mbytes_per_sec": 0, 00:24:11.239 "r_mbytes_per_sec": 0, 00:24:11.239 "w_mbytes_per_sec": 0 00:24:11.239 }, 00:24:11.239 "claimed": true, 00:24:11.239 "claim_type": "exclusive_write", 00:24:11.239 "zoned": false, 00:24:11.239 "supported_io_types": { 00:24:11.239 "read": true, 00:24:11.239 "write": true, 00:24:11.239 "unmap": true, 00:24:11.239 "write_zeroes": true, 00:24:11.239 "flush": true, 00:24:11.239 "reset": true, 00:24:11.239 "compare": false, 00:24:11.239 "compare_and_write": false, 00:24:11.239 "abort": true, 00:24:11.239 "nvme_admin": false, 00:24:11.239 "nvme_io": false 00:24:11.239 }, 00:24:11.239 "memory_domains": [ 00:24:11.239 { 00:24:11.239 "dma_device_id": "system", 00:24:11.239 "dma_device_type": 1 00:24:11.239 }, 00:24:11.239 { 00:24:11.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:11.239 "dma_device_type": 2 00:24:11.239 } 00:24:11.239 ], 00:24:11.239 "driver_specific": {} 00:24:11.239 } 00:24:11.239 ] 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # return 0 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:11.239 10:19:32 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.239 10:19:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:11.239 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:11.239 "name": "Existed_Raid", 00:24:11.239 "uuid": "b98057c5-9874-48a6-8a20-313363cf67e5", 00:24:11.239 "strip_size_kb": 0, 00:24:11.239 "state": "configuring", 00:24:11.239 "raid_level": "raid1", 00:24:11.239 "superblock": true, 00:24:11.239 "num_base_bdevs": 2, 00:24:11.239 "num_base_bdevs_discovered": 1, 00:24:11.239 "num_base_bdevs_operational": 2, 00:24:11.240 "base_bdevs_list": [ 00:24:11.240 { 00:24:11.240 "name": "BaseBdev1", 00:24:11.240 "uuid": "8dcf487f-559c-42cc-a300-cd18f4b6b128", 00:24:11.240 "is_configured": true, 00:24:11.240 "data_offset": 256, 00:24:11.240 "data_size": 7936 00:24:11.240 }, 00:24:11.240 { 00:24:11.240 "name": "BaseBdev2", 00:24:11.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:11.240 "is_configured": false, 00:24:11.240 "data_offset": 0, 00:24:11.240 "data_size": 0 00:24:11.240 } 00:24:11.240 ] 00:24:11.240 }' 00:24:11.240 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:11.240 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:11.811 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:12.072 [2024-06-10 10:19:33.718637] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:12.072 [2024-06-10 10:19:33.718663] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16353f0 name Existed_Raid, state configuring 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:12.072 [2024-06-10 10:19:33.911152] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:12.072 [2024-06-10 10:19:33.912256] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:12.072 [2024-06-10 10:19:33.912280] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:12.072 10:19:33 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.072 10:19:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:12.332 10:19:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:12.332 "name": "Existed_Raid", 00:24:12.332 "uuid": "fed713b3-469c-4b6b-8bfb-0a309abe1adf", 00:24:12.332 "strip_size_kb": 0, 00:24:12.332 "state": "configuring", 00:24:12.332 "raid_level": "raid1", 00:24:12.333 "superblock": true, 00:24:12.333 "num_base_bdevs": 2, 00:24:12.333 "num_base_bdevs_discovered": 1, 00:24:12.333 "num_base_bdevs_operational": 2, 00:24:12.333 "base_bdevs_list": [ 00:24:12.333 { 00:24:12.333 "name": "BaseBdev1", 00:24:12.333 "uuid": "8dcf487f-559c-42cc-a300-cd18f4b6b128", 00:24:12.333 "is_configured": true, 00:24:12.333 "data_offset": 256, 00:24:12.333 "data_size": 7936 00:24:12.333 }, 00:24:12.333 { 00:24:12.333 "name": "BaseBdev2", 00:24:12.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:12.333 "is_configured": false, 00:24:12.333 "data_offset": 0, 00:24:12.333 "data_size": 0 00:24:12.333 } 00:24:12.333 ] 00:24:12.333 }' 00:24:12.333 10:19:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:12.333 10:19:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:12.904 10:19:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:24:13.164 [2024-06-10 10:19:34.826592] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:13.164 [2024-06-10 10:19:34.826695] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16373d0 00:24:13.164 [2024-06-10 10:19:34.826702] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:13.164 [2024-06-10 10:19:34.826744] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1636e10 00:24:13.164 [2024-06-10 10:19:34.826816] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16373d0 00:24:13.164 [2024-06-10 10:19:34.826831] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16373d0 00:24:13.165 [2024-06-10 10:19:34.826880] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:24:13.165 BaseBdev2 00:24:13.165 10:19:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:24:13.165 10:19:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:24:13.165 10:19:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:13.165 10:19:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local i 00:24:13.165 10:19:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:13.165 10:19:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:13.165 10:19:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:13.165 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:13.426 [ 00:24:13.426 { 00:24:13.426 "name": "BaseBdev2", 00:24:13.426 "aliases": [ 00:24:13.426 "64a9f8a9-3d27-46a0-b5aa-b2ce050a5cf5" 00:24:13.426 ], 00:24:13.426 "product_name": "Malloc disk", 00:24:13.426 "block_size": 4096, 00:24:13.426 "num_blocks": 8192, 00:24:13.426 "uuid": "64a9f8a9-3d27-46a0-b5aa-b2ce050a5cf5", 00:24:13.426 "md_size": 32, 00:24:13.426 "md_interleave": false, 00:24:13.426 "dif_type": 0, 00:24:13.426 "assigned_rate_limits": { 00:24:13.426 "rw_ios_per_sec": 0, 00:24:13.426 "rw_mbytes_per_sec": 0, 00:24:13.426 "r_mbytes_per_sec": 0, 00:24:13.426 "w_mbytes_per_sec": 0 00:24:13.426 }, 00:24:13.426 "claimed": true, 00:24:13.426 "claim_type": "exclusive_write", 00:24:13.426 "zoned": false, 00:24:13.426 "supported_io_types": { 00:24:13.426 "read": true, 00:24:13.426 "write": true, 00:24:13.426 "unmap": true, 00:24:13.426 "write_zeroes": true, 00:24:13.426 "flush": true, 00:24:13.426 "reset": true, 00:24:13.426 "compare": false, 00:24:13.426 "compare_and_write": false, 00:24:13.426 "abort": true, 00:24:13.426 "nvme_admin": false, 00:24:13.426 "nvme_io": false 00:24:13.426 }, 00:24:13.426 "memory_domains": [ 00:24:13.426 { 00:24:13.426 "dma_device_id": "system", 00:24:13.426 "dma_device_type": 1 00:24:13.426 }, 00:24:13.426 { 00:24:13.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:13.426 "dma_device_type": 2 00:24:13.426 } 00:24:13.426 ], 00:24:13.426 "driver_specific": {} 00:24:13.426 } 00:24:13.426 ] 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # return 0 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:13.426 10:19:35 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.426 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:13.687 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:13.687 "name": "Existed_Raid", 00:24:13.687 "uuid": "fed713b3-469c-4b6b-8bfb-0a309abe1adf", 00:24:13.687 "strip_size_kb": 0, 00:24:13.687 "state": "online", 00:24:13.687 "raid_level": "raid1", 00:24:13.687 "superblock": true, 00:24:13.687 "num_base_bdevs": 2, 00:24:13.687 "num_base_bdevs_discovered": 2, 00:24:13.687 "num_base_bdevs_operational": 2, 00:24:13.687 "base_bdevs_list": [ 00:24:13.687 { 00:24:13.687 "name": "BaseBdev1", 00:24:13.687 "uuid": "8dcf487f-559c-42cc-a300-cd18f4b6b128", 00:24:13.687 "is_configured": true, 00:24:13.687 "data_offset": 256, 00:24:13.687 "data_size": 7936 00:24:13.687 }, 00:24:13.687 { 00:24:13.687 "name": "BaseBdev2", 00:24:13.687 "uuid": "64a9f8a9-3d27-46a0-b5aa-b2ce050a5cf5", 00:24:13.687 "is_configured": true, 00:24:13.687 "data_offset": 256, 00:24:13.687 "data_size": 7936 00:24:13.687 } 00:24:13.687 ] 00:24:13.687 }' 00:24:13.687 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:13.687 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:14.258 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:24:14.258 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:14.258 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:14.259 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:14.259 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:14.259 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:24:14.259 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:14.259 10:19:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:14.259 
[2024-06-10 10:19:36.041884] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:14.259 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:14.259 "name": "Existed_Raid", 00:24:14.259 "aliases": [ 00:24:14.259 "fed713b3-469c-4b6b-8bfb-0a309abe1adf" 00:24:14.259 ], 00:24:14.259 "product_name": "Raid Volume", 00:24:14.259 "block_size": 4096, 00:24:14.259 "num_blocks": 7936, 00:24:14.259 "uuid": "fed713b3-469c-4b6b-8bfb-0a309abe1adf", 00:24:14.259 "md_size": 32, 00:24:14.259 "md_interleave": false, 00:24:14.259 "dif_type": 0, 00:24:14.259 "assigned_rate_limits": { 00:24:14.259 "rw_ios_per_sec": 0, 00:24:14.259 "rw_mbytes_per_sec": 0, 00:24:14.259 "r_mbytes_per_sec": 0, 00:24:14.259 "w_mbytes_per_sec": 0 00:24:14.259 }, 00:24:14.259 "claimed": false, 00:24:14.259 "zoned": false, 00:24:14.259 "supported_io_types": { 00:24:14.259 "read": true, 00:24:14.259 "write": true, 00:24:14.259 "unmap": false, 00:24:14.259 "write_zeroes": true, 00:24:14.259 "flush": false, 00:24:14.259 "reset": true, 00:24:14.259 "compare": false, 00:24:14.259 "compare_and_write": false, 00:24:14.259 "abort": false, 00:24:14.259 "nvme_admin": false, 00:24:14.259 "nvme_io": false 00:24:14.259 }, 00:24:14.259 "memory_domains": [ 00:24:14.259 { 00:24:14.259 "dma_device_id": "system", 00:24:14.259 "dma_device_type": 1 00:24:14.259 }, 00:24:14.259 { 00:24:14.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:14.259 "dma_device_type": 2 00:24:14.259 }, 00:24:14.259 { 00:24:14.259 "dma_device_id": "system", 00:24:14.259 "dma_device_type": 1 00:24:14.259 }, 00:24:14.259 { 00:24:14.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:14.259 "dma_device_type": 2 00:24:14.259 } 00:24:14.259 ], 00:24:14.259 "driver_specific": { 00:24:14.259 "raid": { 00:24:14.259 "uuid": "fed713b3-469c-4b6b-8bfb-0a309abe1adf", 00:24:14.259 "strip_size_kb": 0, 00:24:14.259 "state": "online", 00:24:14.259 "raid_level": "raid1", 00:24:14.259 "superblock": true, 00:24:14.259 "num_base_bdevs": 2, 00:24:14.259 "num_base_bdevs_discovered": 2, 00:24:14.259 "num_base_bdevs_operational": 2, 00:24:14.259 "base_bdevs_list": [ 00:24:14.259 { 00:24:14.259 "name": "BaseBdev1", 00:24:14.259 "uuid": "8dcf487f-559c-42cc-a300-cd18f4b6b128", 00:24:14.259 "is_configured": true, 00:24:14.259 "data_offset": 256, 00:24:14.259 "data_size": 7936 00:24:14.259 }, 00:24:14.259 { 00:24:14.259 "name": "BaseBdev2", 00:24:14.259 "uuid": "64a9f8a9-3d27-46a0-b5aa-b2ce050a5cf5", 00:24:14.259 "is_configured": true, 00:24:14.259 "data_offset": 256, 00:24:14.259 "data_size": 7936 00:24:14.259 } 00:24:14.259 ] 00:24:14.259 } 00:24:14.259 } 00:24:14.259 }' 00:24:14.259 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:14.259 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:24:14.259 BaseBdev2' 00:24:14.259 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:14.259 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:24:14.259 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:14.521 10:19:36 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:14.521 "name": "BaseBdev1", 00:24:14.521 "aliases": [ 00:24:14.521 "8dcf487f-559c-42cc-a300-cd18f4b6b128" 00:24:14.521 ], 00:24:14.521 "product_name": "Malloc disk", 00:24:14.521 "block_size": 4096, 00:24:14.521 "num_blocks": 8192, 00:24:14.521 "uuid": "8dcf487f-559c-42cc-a300-cd18f4b6b128", 00:24:14.521 "md_size": 32, 00:24:14.521 "md_interleave": false, 00:24:14.521 "dif_type": 0, 00:24:14.521 "assigned_rate_limits": { 00:24:14.521 "rw_ios_per_sec": 0, 00:24:14.521 "rw_mbytes_per_sec": 0, 00:24:14.521 "r_mbytes_per_sec": 0, 00:24:14.521 "w_mbytes_per_sec": 0 00:24:14.521 }, 00:24:14.521 "claimed": true, 00:24:14.521 "claim_type": "exclusive_write", 00:24:14.521 "zoned": false, 00:24:14.521 "supported_io_types": { 00:24:14.521 "read": true, 00:24:14.521 "write": true, 00:24:14.521 "unmap": true, 00:24:14.521 "write_zeroes": true, 00:24:14.521 "flush": true, 00:24:14.521 "reset": true, 00:24:14.521 "compare": false, 00:24:14.521 "compare_and_write": false, 00:24:14.521 "abort": true, 00:24:14.521 "nvme_admin": false, 00:24:14.521 "nvme_io": false 00:24:14.521 }, 00:24:14.521 "memory_domains": [ 00:24:14.521 { 00:24:14.521 "dma_device_id": "system", 00:24:14.521 "dma_device_type": 1 00:24:14.521 }, 00:24:14.521 { 00:24:14.521 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:14.521 "dma_device_type": 2 00:24:14.521 } 00:24:14.521 ], 00:24:14.521 "driver_specific": {} 00:24:14.521 }' 00:24:14.521 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:14.521 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:14.521 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:14.521 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:14.521 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:14.781 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:14.781 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:14.781 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:14.781 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:14.781 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:14.781 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:14.781 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:14.781 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:14.781 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:14.781 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:15.043 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:15.043 "name": "BaseBdev2", 00:24:15.043 "aliases": 
[ 00:24:15.043 "64a9f8a9-3d27-46a0-b5aa-b2ce050a5cf5" 00:24:15.043 ], 00:24:15.043 "product_name": "Malloc disk", 00:24:15.043 "block_size": 4096, 00:24:15.043 "num_blocks": 8192, 00:24:15.043 "uuid": "64a9f8a9-3d27-46a0-b5aa-b2ce050a5cf5", 00:24:15.043 "md_size": 32, 00:24:15.043 "md_interleave": false, 00:24:15.043 "dif_type": 0, 00:24:15.043 "assigned_rate_limits": { 00:24:15.043 "rw_ios_per_sec": 0, 00:24:15.043 "rw_mbytes_per_sec": 0, 00:24:15.043 "r_mbytes_per_sec": 0, 00:24:15.043 "w_mbytes_per_sec": 0 00:24:15.043 }, 00:24:15.043 "claimed": true, 00:24:15.043 "claim_type": "exclusive_write", 00:24:15.043 "zoned": false, 00:24:15.043 "supported_io_types": { 00:24:15.043 "read": true, 00:24:15.043 "write": true, 00:24:15.043 "unmap": true, 00:24:15.043 "write_zeroes": true, 00:24:15.043 "flush": true, 00:24:15.043 "reset": true, 00:24:15.043 "compare": false, 00:24:15.043 "compare_and_write": false, 00:24:15.043 "abort": true, 00:24:15.043 "nvme_admin": false, 00:24:15.043 "nvme_io": false 00:24:15.043 }, 00:24:15.043 "memory_domains": [ 00:24:15.043 { 00:24:15.043 "dma_device_id": "system", 00:24:15.043 "dma_device_type": 1 00:24:15.043 }, 00:24:15.043 { 00:24:15.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:15.043 "dma_device_type": 2 00:24:15.043 } 00:24:15.043 ], 00:24:15.043 "driver_specific": {} 00:24:15.043 }' 00:24:15.043 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:15.043 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:15.043 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:15.043 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:15.304 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:15.304 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:15.304 10:19:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:15.304 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:15.304 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:15.304 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:15.304 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:15.564 [2024-06-10 10:19:37.349033] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 
0 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:15.564 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:15.565 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:15.565 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:15.565 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:15.565 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:15.565 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.565 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:15.825 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:15.825 "name": "Existed_Raid", 00:24:15.825 "uuid": "fed713b3-469c-4b6b-8bfb-0a309abe1adf", 00:24:15.825 "strip_size_kb": 0, 00:24:15.825 "state": "online", 00:24:15.825 "raid_level": "raid1", 00:24:15.825 "superblock": true, 00:24:15.825 "num_base_bdevs": 2, 00:24:15.825 "num_base_bdevs_discovered": 1, 00:24:15.825 "num_base_bdevs_operational": 1, 00:24:15.825 "base_bdevs_list": [ 00:24:15.825 { 00:24:15.825 "name": null, 00:24:15.825 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:15.825 "is_configured": false, 00:24:15.825 "data_offset": 256, 00:24:15.825 "data_size": 7936 00:24:15.825 }, 00:24:15.825 { 00:24:15.825 "name": "BaseBdev2", 00:24:15.825 "uuid": "64a9f8a9-3d27-46a0-b5aa-b2ce050a5cf5", 00:24:15.825 "is_configured": true, 00:24:15.825 "data_offset": 256, 00:24:15.825 "data_size": 7936 00:24:15.825 } 00:24:15.825 ] 00:24:15.825 }' 00:24:15.825 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:15.825 10:19:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:16.395 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:24:16.395 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:16.395 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.395 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:16.655 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:16.655 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:16.655 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:24:16.655 [2024-06-10 10:19:38.461420] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:16.655 [2024-06-10 10:19:38.461487] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:16.655 [2024-06-10 10:19:38.467941] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:16.655 [2024-06-10 10:19:38.467966] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:16.655 [2024-06-10 10:19:38.467972] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16373d0 name Existed_Raid, state offline 00:24:16.655 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:16.655 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:16.655 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.655 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 1114228 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@949 -- # '[' -z 1114228 ']' 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # kill -0 1114228 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # uname 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1114228 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1114228' 00:24:16.915 killing process with pid 1114228 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@968 -- # kill 1114228 00:24:16.915 [2024-06-10 10:19:38.727142] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:16.915 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@973 -- # wait 1114228 00:24:16.915 [2024-06-10 10:19:38.727720] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:17.175 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:24:17.175 00:24:17.175 real 0m8.704s 00:24:17.175 user 0m15.735s 00:24:17.175 sys 0m1.410s 00:24:17.175 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:17.175 10:19:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:17.175 ************************************ 00:24:17.175 END TEST raid_state_function_test_sb_md_separate 00:24:17.175 ************************************ 00:24:17.175 10:19:38 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:24:17.175 10:19:38 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:24:17.175 10:19:38 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:17.175 10:19:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:17.175 ************************************ 00:24:17.175 START TEST raid_superblock_test_md_separate 00:24:17.175 ************************************ 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=1115973 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 1115973 
/var/tmp/spdk-raid.sock 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@830 -- # '[' -z 1115973 ']' 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:17.175 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:17.175 10:19:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:17.175 [2024-06-10 10:19:38.979647] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:24:17.175 [2024-06-10 10:19:38.979690] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1115973 ] 00:24:17.435 [2024-06-10 10:19:39.065313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:17.435 [2024-06-10 10:19:39.127388] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:17.435 [2024-06-10 10:19:39.165970] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:17.436 [2024-06-10 10:19:39.165993] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:18.005 10:19:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:18.005 10:19:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@863 -- # return 0 00:24:18.005 10:19:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:24:18.005 10:19:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:18.005 10:19:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:24:18.005 10:19:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:24:18.005 10:19:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:24:18.005 10:19:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:18.005 10:19:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:18.005 10:19:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:18.005 10:19:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:24:18.265 malloc1 00:24:18.265 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:18.526 [2024-06-10 10:19:40.184547] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:18.526 [2024-06-10 10:19:40.184587] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:18.526 [2024-06-10 10:19:40.184599] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22bb450 00:24:18.526 [2024-06-10 10:19:40.184606] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:18.526 [2024-06-10 10:19:40.185780] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:18.526 [2024-06-10 10:19:40.185799] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:18.526 pt1 00:24:18.526 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:18.526 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:18.526 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:24:18.526 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:24:18.526 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:24:18.526 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:18.526 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:18.526 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:18.526 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:24:18.526 malloc2 00:24:18.786 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:18.786 [2024-06-10 10:19:40.571929] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:18.786 [2024-06-10 10:19:40.571958] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:18.786 [2024-06-10 10:19:40.571966] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2412210 00:24:18.786 [2024-06-10 10:19:40.571973] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:18.786 [2024-06-10 10:19:40.573104] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:18.786 [2024-06-10 10:19:40.573121] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:18.786 pt2 00:24:18.786 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:18.786 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:18.786 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:24:19.047 [2024-06-10 10:19:40.760410] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:19.047 [2024-06-10 10:19:40.761484] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:19.047 [2024-06-10 10:19:40.761593] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24137c0 00:24:19.047 [2024-06-10 10:19:40.761601] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:19.047 [2024-06-10 10:19:40.761650] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22b9750 00:24:19.047 [2024-06-10 10:19:40.761736] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24137c0 00:24:19.047 [2024-06-10 10:19:40.761741] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24137c0 00:24:19.047 [2024-06-10 10:19:40.761789] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.047 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:19.307 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:19.307 "name": "raid_bdev1", 00:24:19.307 "uuid": "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44", 00:24:19.307 "strip_size_kb": 0, 00:24:19.307 "state": "online", 00:24:19.307 "raid_level": "raid1", 00:24:19.307 "superblock": true, 00:24:19.307 "num_base_bdevs": 2, 00:24:19.307 "num_base_bdevs_discovered": 2, 00:24:19.307 "num_base_bdevs_operational": 2, 00:24:19.307 "base_bdevs_list": [ 00:24:19.307 { 00:24:19.307 "name": "pt1", 00:24:19.307 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:19.307 "is_configured": true, 00:24:19.307 "data_offset": 256, 00:24:19.307 "data_size": 7936 00:24:19.307 }, 00:24:19.307 { 00:24:19.307 "name": "pt2", 00:24:19.307 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:19.307 "is_configured": true, 00:24:19.307 "data_offset": 256, 00:24:19.307 
"data_size": 7936 00:24:19.307 } 00:24:19.307 ] 00:24:19.307 }' 00:24:19.307 10:19:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:19.307 10:19:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:19.877 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:24:19.877 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:19.877 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:19.877 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:19.877 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:19.877 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:24:19.877 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:19.877 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:19.877 [2024-06-10 10:19:41.682903] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:19.877 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:19.877 "name": "raid_bdev1", 00:24:19.877 "aliases": [ 00:24:19.877 "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44" 00:24:19.877 ], 00:24:19.877 "product_name": "Raid Volume", 00:24:19.877 "block_size": 4096, 00:24:19.877 "num_blocks": 7936, 00:24:19.877 "uuid": "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44", 00:24:19.877 "md_size": 32, 00:24:19.877 "md_interleave": false, 00:24:19.877 "dif_type": 0, 00:24:19.877 "assigned_rate_limits": { 00:24:19.877 "rw_ios_per_sec": 0, 00:24:19.877 "rw_mbytes_per_sec": 0, 00:24:19.877 "r_mbytes_per_sec": 0, 00:24:19.877 "w_mbytes_per_sec": 0 00:24:19.877 }, 00:24:19.877 "claimed": false, 00:24:19.877 "zoned": false, 00:24:19.877 "supported_io_types": { 00:24:19.877 "read": true, 00:24:19.877 "write": true, 00:24:19.877 "unmap": false, 00:24:19.877 "write_zeroes": true, 00:24:19.877 "flush": false, 00:24:19.877 "reset": true, 00:24:19.877 "compare": false, 00:24:19.877 "compare_and_write": false, 00:24:19.877 "abort": false, 00:24:19.877 "nvme_admin": false, 00:24:19.877 "nvme_io": false 00:24:19.877 }, 00:24:19.877 "memory_domains": [ 00:24:19.877 { 00:24:19.877 "dma_device_id": "system", 00:24:19.877 "dma_device_type": 1 00:24:19.877 }, 00:24:19.877 { 00:24:19.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:19.877 "dma_device_type": 2 00:24:19.877 }, 00:24:19.877 { 00:24:19.877 "dma_device_id": "system", 00:24:19.877 "dma_device_type": 1 00:24:19.877 }, 00:24:19.877 { 00:24:19.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:19.877 "dma_device_type": 2 00:24:19.877 } 00:24:19.877 ], 00:24:19.877 "driver_specific": { 00:24:19.877 "raid": { 00:24:19.877 "uuid": "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44", 00:24:19.877 "strip_size_kb": 0, 00:24:19.877 "state": "online", 00:24:19.877 "raid_level": "raid1", 00:24:19.877 "superblock": true, 00:24:19.877 "num_base_bdevs": 2, 00:24:19.877 "num_base_bdevs_discovered": 2, 00:24:19.877 "num_base_bdevs_operational": 2, 00:24:19.877 "base_bdevs_list": [ 00:24:19.877 { 00:24:19.877 "name": "pt1", 
00:24:19.877 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:19.877 "is_configured": true, 00:24:19.877 "data_offset": 256, 00:24:19.877 "data_size": 7936 00:24:19.877 }, 00:24:19.877 { 00:24:19.877 "name": "pt2", 00:24:19.877 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:19.877 "is_configured": true, 00:24:19.877 "data_offset": 256, 00:24:19.877 "data_size": 7936 00:24:19.877 } 00:24:19.877 ] 00:24:19.877 } 00:24:19.877 } 00:24:19.877 }' 00:24:19.877 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:20.138 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:20.138 pt2' 00:24:20.138 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:20.138 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:20.138 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:20.138 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:20.138 "name": "pt1", 00:24:20.138 "aliases": [ 00:24:20.138 "00000000-0000-0000-0000-000000000001" 00:24:20.138 ], 00:24:20.138 "product_name": "passthru", 00:24:20.139 "block_size": 4096, 00:24:20.139 "num_blocks": 8192, 00:24:20.139 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:20.139 "md_size": 32, 00:24:20.139 "md_interleave": false, 00:24:20.139 "dif_type": 0, 00:24:20.139 "assigned_rate_limits": { 00:24:20.139 "rw_ios_per_sec": 0, 00:24:20.139 "rw_mbytes_per_sec": 0, 00:24:20.139 "r_mbytes_per_sec": 0, 00:24:20.139 "w_mbytes_per_sec": 0 00:24:20.139 }, 00:24:20.139 "claimed": true, 00:24:20.139 "claim_type": "exclusive_write", 00:24:20.139 "zoned": false, 00:24:20.139 "supported_io_types": { 00:24:20.139 "read": true, 00:24:20.139 "write": true, 00:24:20.139 "unmap": true, 00:24:20.139 "write_zeroes": true, 00:24:20.139 "flush": true, 00:24:20.139 "reset": true, 00:24:20.139 "compare": false, 00:24:20.139 "compare_and_write": false, 00:24:20.139 "abort": true, 00:24:20.139 "nvme_admin": false, 00:24:20.139 "nvme_io": false 00:24:20.139 }, 00:24:20.139 "memory_domains": [ 00:24:20.139 { 00:24:20.139 "dma_device_id": "system", 00:24:20.139 "dma_device_type": 1 00:24:20.139 }, 00:24:20.139 { 00:24:20.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:20.139 "dma_device_type": 2 00:24:20.139 } 00:24:20.139 ], 00:24:20.139 "driver_specific": { 00:24:20.139 "passthru": { 00:24:20.139 "name": "pt1", 00:24:20.139 "base_bdev_name": "malloc1" 00:24:20.139 } 00:24:20.139 } 00:24:20.139 }' 00:24:20.139 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:20.139 10:19:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:20.462 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:20.462 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:20.462 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:20.463 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:20.463 10:19:42 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:20.463 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:20.463 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:20.463 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:20.463 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:20.463 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:20.463 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:20.740 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:20.740 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:20.740 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:20.740 "name": "pt2", 00:24:20.740 "aliases": [ 00:24:20.740 "00000000-0000-0000-0000-000000000002" 00:24:20.740 ], 00:24:20.740 "product_name": "passthru", 00:24:20.740 "block_size": 4096, 00:24:20.740 "num_blocks": 8192, 00:24:20.740 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:20.740 "md_size": 32, 00:24:20.740 "md_interleave": false, 00:24:20.740 "dif_type": 0, 00:24:20.740 "assigned_rate_limits": { 00:24:20.740 "rw_ios_per_sec": 0, 00:24:20.740 "rw_mbytes_per_sec": 0, 00:24:20.740 "r_mbytes_per_sec": 0, 00:24:20.740 "w_mbytes_per_sec": 0 00:24:20.740 }, 00:24:20.740 "claimed": true, 00:24:20.740 "claim_type": "exclusive_write", 00:24:20.740 "zoned": false, 00:24:20.740 "supported_io_types": { 00:24:20.740 "read": true, 00:24:20.740 "write": true, 00:24:20.740 "unmap": true, 00:24:20.740 "write_zeroes": true, 00:24:20.740 "flush": true, 00:24:20.740 "reset": true, 00:24:20.740 "compare": false, 00:24:20.740 "compare_and_write": false, 00:24:20.740 "abort": true, 00:24:20.740 "nvme_admin": false, 00:24:20.740 "nvme_io": false 00:24:20.740 }, 00:24:20.740 "memory_domains": [ 00:24:20.740 { 00:24:20.740 "dma_device_id": "system", 00:24:20.740 "dma_device_type": 1 00:24:20.740 }, 00:24:20.740 { 00:24:20.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:20.740 "dma_device_type": 2 00:24:20.740 } 00:24:20.740 ], 00:24:20.740 "driver_specific": { 00:24:20.740 "passthru": { 00:24:20.740 "name": "pt2", 00:24:20.740 "base_bdev_name": "malloc2" 00:24:20.740 } 00:24:20.740 } 00:24:20.740 }' 00:24:20.740 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:20.740 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:20.740 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:20.740 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:21.001 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:21.001 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:21.001 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:21.001 10:19:42 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:21.001 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:21.001 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:21.001 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:21.001 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:21.001 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:21.001 10:19:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:24:21.261 [2024-06-10 10:19:42.998231] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:21.261 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=f61bda17-5d82-4c33-ab2b-e4dfee8bbb44 00:24:21.261 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z f61bda17-5d82-4c33-ab2b-e4dfee8bbb44 ']' 00:24:21.261 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:21.521 [2024-06-10 10:19:43.190541] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:21.521 [2024-06-10 10:19:43.190553] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:21.521 [2024-06-10 10:19:43.190593] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:21.521 [2024-06-10 10:19:43.190631] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:21.521 [2024-06-10 10:19:43.190637] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24137c0 name raid_bdev1, state offline 00:24:21.521 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.521 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:24:21.781 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:24:21.781 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:24:21.781 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:21.781 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:21.781 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:21.781 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:22.041 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:24:22.041 10:19:43 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@649 -- # local es=0 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:22.301 10:19:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:22.301 [2024-06-10 10:19:44.132890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:24:22.301 [2024-06-10 10:19:44.133954] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:24:22.301 [2024-06-10 10:19:44.133995] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:24:22.301 [2024-06-10 10:19:44.134020] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:24:22.301 [2024-06-10 10:19:44.134030] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:22.301 [2024-06-10 10:19:44.134035] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22bae00 name raid_bdev1, state configuring 00:24:22.301 request: 00:24:22.301 { 00:24:22.301 "name": "raid_bdev1", 00:24:22.301 "raid_level": "raid1", 00:24:22.301 "base_bdevs": [ 00:24:22.301 "malloc1", 00:24:22.301 "malloc2" 00:24:22.301 ], 00:24:22.301 "superblock": false, 00:24:22.301 "method": "bdev_raid_create", 00:24:22.301 "req_id": 1 
00:24:22.301 } 00:24:22.301 Got JSON-RPC error response 00:24:22.301 response: 00:24:22.301 { 00:24:22.301 "code": -17, 00:24:22.301 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:24:22.301 } 00:24:22.301 10:19:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # es=1 00:24:22.301 10:19:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:24:22.301 10:19:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:24:22.301 10:19:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:24:22.301 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.301 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:24:22.561 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:24:22.561 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:24:22.561 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:22.821 [2024-06-10 10:19:44.513812] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:22.821 [2024-06-10 10:19:44.513838] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:22.821 [2024-06-10 10:19:44.513847] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2413280 00:24:22.821 [2024-06-10 10:19:44.513853] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:22.821 [2024-06-10 10:19:44.514961] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:22.821 [2024-06-10 10:19:44.514978] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:22.821 [2024-06-10 10:19:44.515006] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:22.821 [2024-06-10 10:19:44.515021] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:22.821 pt1 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.821 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.080 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:23.080 "name": "raid_bdev1", 00:24:23.080 "uuid": "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44", 00:24:23.080 "strip_size_kb": 0, 00:24:23.080 "state": "configuring", 00:24:23.080 "raid_level": "raid1", 00:24:23.080 "superblock": true, 00:24:23.080 "num_base_bdevs": 2, 00:24:23.080 "num_base_bdevs_discovered": 1, 00:24:23.080 "num_base_bdevs_operational": 2, 00:24:23.080 "base_bdevs_list": [ 00:24:23.080 { 00:24:23.080 "name": "pt1", 00:24:23.080 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:23.080 "is_configured": true, 00:24:23.080 "data_offset": 256, 00:24:23.080 "data_size": 7936 00:24:23.080 }, 00:24:23.080 { 00:24:23.080 "name": null, 00:24:23.080 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:23.080 "is_configured": false, 00:24:23.080 "data_offset": 256, 00:24:23.080 "data_size": 7936 00:24:23.080 } 00:24:23.080 ] 00:24:23.080 }' 00:24:23.080 10:19:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:23.080 10:19:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:23.652 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:24:23.652 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:24:23.652 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:23.652 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:23.652 [2024-06-10 10:19:45.388035] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:23.652 [2024-06-10 10:19:45.388063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:23.652 [2024-06-10 10:19:45.388078] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23feb80 00:24:23.652 [2024-06-10 10:19:45.388085] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:23.652 [2024-06-10 10:19:45.388218] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:23.652 [2024-06-10 10:19:45.388226] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:23.652 [2024-06-10 10:19:45.388253] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:23.652 [2024-06-10 10:19:45.388264] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:23.652 [2024-06-10 10:19:45.388332] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23fef90 00:24:23.652 [2024-06-10 10:19:45.388338] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:23.652 [2024-06-10 10:19:45.388374] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23fc830 00:24:23.652 [2024-06-10 10:19:45.388451] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23fef90 00:24:23.652 [2024-06-10 10:19:45.388456] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23fef90 00:24:23.653 [2024-06-10 10:19:45.388505] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:23.653 pt2 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.653 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.913 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:23.913 "name": "raid_bdev1", 00:24:23.913 "uuid": "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44", 00:24:23.913 "strip_size_kb": 0, 00:24:23.913 "state": "online", 00:24:23.913 "raid_level": "raid1", 00:24:23.913 "superblock": true, 00:24:23.913 "num_base_bdevs": 2, 00:24:23.913 "num_base_bdevs_discovered": 2, 00:24:23.913 "num_base_bdevs_operational": 2, 00:24:23.913 "base_bdevs_list": [ 00:24:23.913 { 00:24:23.913 "name": "pt1", 00:24:23.913 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:23.913 "is_configured": true, 00:24:23.913 "data_offset": 256, 00:24:23.913 "data_size": 7936 00:24:23.913 }, 00:24:23.913 { 00:24:23.913 "name": "pt2", 00:24:23.913 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:23.913 "is_configured": true, 00:24:23.913 "data_offset": 256, 00:24:23.913 "data_size": 7936 00:24:23.913 } 00:24:23.913 ] 00:24:23.913 }' 00:24:23.913 10:19:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:23.913 10:19:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:24.484 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:24:24.484 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:24.484 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:24.484 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:24.484 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:24.484 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:24:24.484 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:24.485 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:24.485 [2024-06-10 10:19:46.334612] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:24.746 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:24.746 "name": "raid_bdev1", 00:24:24.746 "aliases": [ 00:24:24.746 "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44" 00:24:24.746 ], 00:24:24.746 "product_name": "Raid Volume", 00:24:24.746 "block_size": 4096, 00:24:24.746 "num_blocks": 7936, 00:24:24.746 "uuid": "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44", 00:24:24.746 "md_size": 32, 00:24:24.746 "md_interleave": false, 00:24:24.746 "dif_type": 0, 00:24:24.746 "assigned_rate_limits": { 00:24:24.746 "rw_ios_per_sec": 0, 00:24:24.746 "rw_mbytes_per_sec": 0, 00:24:24.746 "r_mbytes_per_sec": 0, 00:24:24.746 "w_mbytes_per_sec": 0 00:24:24.746 }, 00:24:24.746 "claimed": false, 00:24:24.746 "zoned": false, 00:24:24.746 "supported_io_types": { 00:24:24.746 "read": true, 00:24:24.746 "write": true, 00:24:24.746 "unmap": false, 00:24:24.746 "write_zeroes": true, 00:24:24.746 "flush": false, 00:24:24.746 "reset": true, 00:24:24.746 "compare": false, 00:24:24.746 "compare_and_write": false, 00:24:24.746 "abort": false, 00:24:24.746 "nvme_admin": false, 00:24:24.746 "nvme_io": false 00:24:24.746 }, 00:24:24.746 "memory_domains": [ 00:24:24.746 { 00:24:24.746 "dma_device_id": "system", 00:24:24.746 "dma_device_type": 1 00:24:24.746 }, 00:24:24.746 { 00:24:24.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:24.746 "dma_device_type": 2 00:24:24.746 }, 00:24:24.746 { 00:24:24.746 "dma_device_id": "system", 00:24:24.746 "dma_device_type": 1 00:24:24.746 }, 00:24:24.746 { 00:24:24.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:24.746 "dma_device_type": 2 00:24:24.746 } 00:24:24.746 ], 00:24:24.746 "driver_specific": { 00:24:24.746 "raid": { 00:24:24.746 "uuid": "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44", 00:24:24.746 "strip_size_kb": 0, 00:24:24.746 "state": "online", 00:24:24.746 "raid_level": "raid1", 00:24:24.746 "superblock": true, 00:24:24.746 "num_base_bdevs": 2, 00:24:24.746 "num_base_bdevs_discovered": 2, 00:24:24.746 "num_base_bdevs_operational": 2, 00:24:24.746 "base_bdevs_list": [ 00:24:24.746 { 00:24:24.746 "name": "pt1", 00:24:24.746 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:24.746 "is_configured": true, 00:24:24.746 "data_offset": 256, 00:24:24.746 "data_size": 7936 00:24:24.746 }, 00:24:24.746 { 00:24:24.746 "name": "pt2", 00:24:24.746 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:24.746 "is_configured": true, 00:24:24.746 "data_offset": 256, 00:24:24.746 "data_size": 7936 
00:24:24.746 } 00:24:24.746 ] 00:24:24.746 } 00:24:24.746 } 00:24:24.746 }' 00:24:24.746 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:24.746 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:24.746 pt2' 00:24:24.746 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:24.746 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:24.746 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:24.746 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:24.746 "name": "pt1", 00:24:24.746 "aliases": [ 00:24:24.746 "00000000-0000-0000-0000-000000000001" 00:24:24.746 ], 00:24:24.746 "product_name": "passthru", 00:24:24.746 "block_size": 4096, 00:24:24.746 "num_blocks": 8192, 00:24:24.746 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:24.746 "md_size": 32, 00:24:24.746 "md_interleave": false, 00:24:24.746 "dif_type": 0, 00:24:24.746 "assigned_rate_limits": { 00:24:24.746 "rw_ios_per_sec": 0, 00:24:24.746 "rw_mbytes_per_sec": 0, 00:24:24.746 "r_mbytes_per_sec": 0, 00:24:24.746 "w_mbytes_per_sec": 0 00:24:24.746 }, 00:24:24.746 "claimed": true, 00:24:24.746 "claim_type": "exclusive_write", 00:24:24.746 "zoned": false, 00:24:24.746 "supported_io_types": { 00:24:24.746 "read": true, 00:24:24.746 "write": true, 00:24:24.746 "unmap": true, 00:24:24.746 "write_zeroes": true, 00:24:24.746 "flush": true, 00:24:24.746 "reset": true, 00:24:24.746 "compare": false, 00:24:24.746 "compare_and_write": false, 00:24:24.746 "abort": true, 00:24:24.746 "nvme_admin": false, 00:24:24.746 "nvme_io": false 00:24:24.746 }, 00:24:24.746 "memory_domains": [ 00:24:24.746 { 00:24:24.746 "dma_device_id": "system", 00:24:24.746 "dma_device_type": 1 00:24:24.746 }, 00:24:24.746 { 00:24:24.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:24.746 "dma_device_type": 2 00:24:24.746 } 00:24:24.746 ], 00:24:24.746 "driver_specific": { 00:24:24.746 "passthru": { 00:24:24.746 "name": "pt1", 00:24:24.746 "base_bdev_name": "malloc1" 00:24:24.746 } 00:24:24.746 } 00:24:24.746 }' 00:24:24.746 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:25.008 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:25.008 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:25.008 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:25.008 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:25.008 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:25.008 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:25.008 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:25.008 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:25.008 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 
-- # jq .dif_type 00:24:25.269 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:25.269 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:25.269 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:25.269 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:25.269 10:19:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:25.269 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:25.269 "name": "pt2", 00:24:25.269 "aliases": [ 00:24:25.269 "00000000-0000-0000-0000-000000000002" 00:24:25.269 ], 00:24:25.269 "product_name": "passthru", 00:24:25.269 "block_size": 4096, 00:24:25.269 "num_blocks": 8192, 00:24:25.269 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:25.269 "md_size": 32, 00:24:25.269 "md_interleave": false, 00:24:25.269 "dif_type": 0, 00:24:25.269 "assigned_rate_limits": { 00:24:25.269 "rw_ios_per_sec": 0, 00:24:25.269 "rw_mbytes_per_sec": 0, 00:24:25.269 "r_mbytes_per_sec": 0, 00:24:25.269 "w_mbytes_per_sec": 0 00:24:25.269 }, 00:24:25.269 "claimed": true, 00:24:25.269 "claim_type": "exclusive_write", 00:24:25.269 "zoned": false, 00:24:25.269 "supported_io_types": { 00:24:25.269 "read": true, 00:24:25.269 "write": true, 00:24:25.269 "unmap": true, 00:24:25.269 "write_zeroes": true, 00:24:25.269 "flush": true, 00:24:25.269 "reset": true, 00:24:25.269 "compare": false, 00:24:25.269 "compare_and_write": false, 00:24:25.269 "abort": true, 00:24:25.269 "nvme_admin": false, 00:24:25.269 "nvme_io": false 00:24:25.269 }, 00:24:25.269 "memory_domains": [ 00:24:25.269 { 00:24:25.269 "dma_device_id": "system", 00:24:25.269 "dma_device_type": 1 00:24:25.269 }, 00:24:25.269 { 00:24:25.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:25.269 "dma_device_type": 2 00:24:25.269 } 00:24:25.269 ], 00:24:25.269 "driver_specific": { 00:24:25.269 "passthru": { 00:24:25.269 "name": "pt2", 00:24:25.269 "base_bdev_name": "malloc2" 00:24:25.269 } 00:24:25.269 } 00:24:25.269 }' 00:24:25.269 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:25.530 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:25.530 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:25.530 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:25.530 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:25.530 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:25.530 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:25.530 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:25.792 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:25.792 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:25.792 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:25.792 10:19:47 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:25.792 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:24:25.792 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:25.792 [2024-06-10 10:19:47.645925] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' f61bda17-5d82-4c33-ab2b-e4dfee8bbb44 '!=' f61bda17-5d82-4c33-ab2b-e4dfee8bbb44 ']' 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:26.053 [2024-06-10 10:19:47.842259] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.053 10:19:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.313 10:19:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:26.313 "name": "raid_bdev1", 00:24:26.313 "uuid": "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44", 00:24:26.313 "strip_size_kb": 0, 00:24:26.313 "state": "online", 00:24:26.313 "raid_level": "raid1", 00:24:26.313 "superblock": true, 00:24:26.313 "num_base_bdevs": 2, 00:24:26.313 "num_base_bdevs_discovered": 1, 00:24:26.313 "num_base_bdevs_operational": 1, 00:24:26.313 "base_bdevs_list": [ 00:24:26.313 { 00:24:26.313 "name": null, 00:24:26.313 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:26.313 
"is_configured": false, 00:24:26.313 "data_offset": 256, 00:24:26.313 "data_size": 7936 00:24:26.313 }, 00:24:26.313 { 00:24:26.313 "name": "pt2", 00:24:26.313 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:26.313 "is_configured": true, 00:24:26.313 "data_offset": 256, 00:24:26.313 "data_size": 7936 00:24:26.313 } 00:24:26.313 ] 00:24:26.313 }' 00:24:26.313 10:19:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:26.313 10:19:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:26.883 10:19:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:26.883 [2024-06-10 10:19:48.740514] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:26.883 [2024-06-10 10:19:48.740529] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:26.883 [2024-06-10 10:19:48.740564] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:26.883 [2024-06-10 10:19:48.740593] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:26.883 [2024-06-10 10:19:48.740599] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23fef90 name raid_bdev1, state offline 00:24:27.143 10:19:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:24:27.143 10:19:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.143 10:19:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:24:27.143 10:19:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:24:27.143 10:19:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:24:27.143 10:19:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:27.143 10:19:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:27.403 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:24:27.403 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:27.403 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:24:27.403 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:24:27.403 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:24:27.403 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:27.664 [2024-06-10 10:19:49.317954] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:27.664 [2024-06-10 10:19:49.317983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:27.664 [2024-06-10 10:19:49.317993] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2413550 00:24:27.664 [2024-06-10 10:19:49.317999] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:27.664 [2024-06-10 10:19:49.319143] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:27.664 [2024-06-10 10:19:49.319160] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:27.664 [2024-06-10 10:19:49.319191] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:27.664 [2024-06-10 10:19:49.319209] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:27.664 [2024-06-10 10:19:49.319263] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22b9360 00:24:27.664 [2024-06-10 10:19:49.319269] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:27.664 [2024-06-10 10:19:49.319308] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ff900 00:24:27.664 [2024-06-10 10:19:49.319382] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22b9360 00:24:27.664 [2024-06-10 10:19:49.319386] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22b9360 00:24:27.664 [2024-06-10 10:19:49.319433] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:27.664 pt2 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.664 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.924 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:27.924 "name": "raid_bdev1", 00:24:27.924 "uuid": "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44", 00:24:27.924 "strip_size_kb": 0, 00:24:27.924 "state": "online", 00:24:27.924 "raid_level": "raid1", 00:24:27.924 "superblock": true, 00:24:27.924 "num_base_bdevs": 2, 00:24:27.924 "num_base_bdevs_discovered": 1, 00:24:27.924 "num_base_bdevs_operational": 1, 00:24:27.924 "base_bdevs_list": [ 00:24:27.924 { 00:24:27.924 "name": 
null, 00:24:27.924 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:27.924 "is_configured": false, 00:24:27.924 "data_offset": 256, 00:24:27.924 "data_size": 7936 00:24:27.924 }, 00:24:27.924 { 00:24:27.924 "name": "pt2", 00:24:27.924 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:27.924 "is_configured": true, 00:24:27.924 "data_offset": 256, 00:24:27.924 "data_size": 7936 00:24:27.924 } 00:24:27.924 ] 00:24:27.924 }' 00:24:27.924 10:19:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:27.924 10:19:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:28.494 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:28.494 [2024-06-10 10:19:50.244287] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:28.494 [2024-06-10 10:19:50.244307] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:28.494 [2024-06-10 10:19:50.244343] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:28.494 [2024-06-10 10:19:50.244373] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:28.494 [2024-06-10 10:19:50.244379] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b9360 name raid_bdev1, state offline 00:24:28.494 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.494 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:24:28.755 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:24:28.755 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:24:28.755 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:24:28.755 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:29.015 [2024-06-10 10:19:50.621223] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:29.015 [2024-06-10 10:19:50.621248] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:29.015 [2024-06-10 10:19:50.621258] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23fe810 00:24:29.015 [2024-06-10 10:19:50.621264] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:29.015 [2024-06-10 10:19:50.622394] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:29.015 [2024-06-10 10:19:50.622419] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:29.015 [2024-06-10 10:19:50.622451] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:29.015 [2024-06-10 10:19:50.622466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:29.015 [2024-06-10 10:19:50.622533] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than 
existing raid bdev raid_bdev1 (2) 00:24:29.015 [2024-06-10 10:19:50.622539] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:29.015 [2024-06-10 10:19:50.622547] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2400980 name raid_bdev1, state configuring 00:24:29.015 [2024-06-10 10:19:50.622560] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:29.015 [2024-06-10 10:19:50.622593] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23ff980 00:24:29.015 [2024-06-10 10:19:50.622599] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:29.015 [2024-06-10 10:19:50.622638] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2401520 00:24:29.015 [2024-06-10 10:19:50.622712] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23ff980 00:24:29.015 [2024-06-10 10:19:50.622717] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23ff980 00:24:29.015 [2024-06-10 10:19:50.622769] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:29.015 pt1 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:29.015 "name": "raid_bdev1", 00:24:29.015 "uuid": "f61bda17-5d82-4c33-ab2b-e4dfee8bbb44", 00:24:29.015 "strip_size_kb": 0, 00:24:29.015 "state": "online", 00:24:29.015 "raid_level": "raid1", 00:24:29.015 "superblock": true, 00:24:29.015 "num_base_bdevs": 2, 00:24:29.015 "num_base_bdevs_discovered": 1, 00:24:29.015 "num_base_bdevs_operational": 1, 00:24:29.015 "base_bdevs_list": [ 00:24:29.015 { 00:24:29.015 "name": null, 00:24:29.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:29.015 "is_configured": false, 00:24:29.015 "data_offset": 256, 
00:24:29.015 "data_size": 7936 00:24:29.015 }, 00:24:29.015 { 00:24:29.015 "name": "pt2", 00:24:29.015 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:29.015 "is_configured": true, 00:24:29.015 "data_offset": 256, 00:24:29.015 "data_size": 7936 00:24:29.015 } 00:24:29.015 ] 00:24:29.015 }' 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:29.015 10:19:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:29.585 10:19:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:29.585 10:19:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:24:29.844 10:19:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:24:29.844 10:19:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:29.844 10:19:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:24:30.104 [2024-06-10 10:19:51.760257] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' f61bda17-5d82-4c33-ab2b-e4dfee8bbb44 '!=' f61bda17-5d82-4c33-ab2b-e4dfee8bbb44 ']' 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 1115973 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@949 -- # '[' -z 1115973 ']' 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # kill -0 1115973 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # uname 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1115973 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1115973' 00:24:30.104 killing process with pid 1115973 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@968 -- # kill 1115973 00:24:30.104 [2024-06-10 10:19:51.828130] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:30.104 [2024-06-10 10:19:51.828166] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:30.104 [2024-06-10 10:19:51.828196] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:30.104 [2024-06-10 10:19:51.828201] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23ff980 name raid_bdev1, state offline 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@973 -- # wait 1115973 
00:24:30.104 [2024-06-10 10:19:51.840550] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:24:30.104 00:24:30.104 real 0m13.033s 00:24:30.104 user 0m24.084s 00:24:30.104 sys 0m2.002s 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:30.104 10:19:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:30.104 ************************************ 00:24:30.104 END TEST raid_superblock_test_md_separate 00:24:30.104 ************************************ 00:24:30.411 10:19:51 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:24:30.411 10:19:51 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:24:30.411 10:19:51 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:24:30.411 10:19:51 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:30.411 10:19:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:30.411 ************************************ 00:24:30.411 START TEST raid_rebuild_test_sb_md_separate 00:24:30.411 ************************************ 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 
00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=1118436 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 1118436 /var/tmp/spdk-raid.sock 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@830 -- # '[' -z 1118436 ']' 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:30.411 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:30.411 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:30.411 [2024-06-10 10:19:52.082930] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:24:30.411 [2024-06-10 10:19:52.082975] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1118436 ] 00:24:30.411 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:30.411 Zero copy mechanism will not be used. 
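
For reference, the bdevperf invocation whose startup banner appears above can be reproduced outside the harness. The sketch below is illustrative only: it reuses the build-tree paths and RPC socket from this workspace, and it polls rpc_get_methods as a readiness probe, which is one simple way to wait for the socket rather than the harness's own waitforlisten helper.

bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
# launch bdevperf in the background with the same arguments as the traced run
$bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!
# wait until the application answers RPCs before creating the test bdevs against it
until $rpc -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
        sleep 0.5
done
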
00:24:30.411 [2024-06-10 10:19:52.169692] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:30.411 [2024-06-10 10:19:52.234182] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:30.411 [2024-06-10 10:19:52.276052] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:30.411 [2024-06-10 10:19:52.276073] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:31.351 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:31.351 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@863 -- # return 0 00:24:31.351 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:31.351 10:19:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:24:31.351 BaseBdev1_malloc 00:24:31.351 10:19:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:31.351 [2024-06-10 10:19:53.174163] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:31.351 [2024-06-10 10:19:53.174199] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:31.351 [2024-06-10 10:19:53.174213] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e5450 00:24:31.351 [2024-06-10 10:19:53.174219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:31.352 [2024-06-10 10:19:53.175353] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:31.352 [2024-06-10 10:19:53.175371] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:31.352 BaseBdev1 00:24:31.352 10:19:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:31.352 10:19:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:24:31.612 BaseBdev2_malloc 00:24:31.612 10:19:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:31.872 [2024-06-10 10:19:53.549639] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:31.872 [2024-06-10 10:19:53.549667] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:31.872 [2024-06-10 10:19:53.549680] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x153c270 00:24:31.872 [2024-06-10 10:19:53.549687] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:31.872 [2024-06-10 10:19:53.550816] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:31.872 [2024-06-10 10:19:53.550839] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:31.872 BaseBdev2 00:24:31.872 10:19:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:24:31.872 spare_malloc 00:24:31.872 10:19:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:32.132 spare_delay 00:24:32.132 10:19:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:32.392 [2024-06-10 10:19:54.053377] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:32.392 [2024-06-10 10:19:54.053406] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:32.392 [2024-06-10 10:19:54.053419] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1528460 00:24:32.392 [2024-06-10 10:19:54.053430] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:32.392 [2024-06-10 10:19:54.054517] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:32.392 [2024-06-10 10:19:54.054535] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:32.392 spare 00:24:32.392 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:32.392 [2024-06-10 10:19:54.241872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:32.392 [2024-06-10 10:19:54.242871] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:32.392 [2024-06-10 10:19:54.242995] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1529a60 00:24:32.392 [2024-06-10 10:19:54.243005] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:32.392 [2024-06-10 10:19:54.243056] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e35a0 00:24:32.392 [2024-06-10 10:19:54.243142] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1529a60 00:24:32.392 [2024-06-10 10:19:54.243148] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1529a60 00:24:32.392 [2024-06-10 10:19:54.243195] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:32.392 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:32.392 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:32.392 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:32.392 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:32.392 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:32.652 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:32.652 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:24:32.652 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:32.652 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:32.652 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:32.652 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.652 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:32.652 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:32.652 "name": "raid_bdev1", 00:24:32.652 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:32.652 "strip_size_kb": 0, 00:24:32.652 "state": "online", 00:24:32.652 "raid_level": "raid1", 00:24:32.652 "superblock": true, 00:24:32.652 "num_base_bdevs": 2, 00:24:32.652 "num_base_bdevs_discovered": 2, 00:24:32.652 "num_base_bdevs_operational": 2, 00:24:32.652 "base_bdevs_list": [ 00:24:32.652 { 00:24:32.652 "name": "BaseBdev1", 00:24:32.652 "uuid": "e92b0abf-9ec9-5e60-8ac8-035555a54a53", 00:24:32.652 "is_configured": true, 00:24:32.652 "data_offset": 256, 00:24:32.652 "data_size": 7936 00:24:32.652 }, 00:24:32.652 { 00:24:32.652 "name": "BaseBdev2", 00:24:32.652 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:32.652 "is_configured": true, 00:24:32.652 "data_offset": 256, 00:24:32.652 "data_size": 7936 00:24:32.652 } 00:24:32.652 ] 00:24:32.652 }' 00:24:32.652 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:32.652 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:33.222 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:33.222 10:19:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:33.482 [2024-06-10 10:19:55.132287] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:33.482 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:33.742 [2024-06-10 10:19:55.517112] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x152ab00 00:24:33.742 /dev/nbd0 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:33.742 1+0 records in 00:24:33.742 1+0 records out 00:24:33.742 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258296 s, 15.9 MB/s 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:33.742 10:19:55 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:33.742 10:19:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:24:34.312 7936+0 records in 00:24:34.312 7936+0 records out 00:24:34.312 32505856 bytes (33 MB, 31 MiB) copied, 0.52797 s, 61.6 MB/s 00:24:34.312 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:34.312 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:34.312 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:34.312 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:34.312 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:24:34.312 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:34.312 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:34.572 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:34.572 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:34.572 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:34.572 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:34.572 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:34.572 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:34.572 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:24:34.572 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:24:34.572 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:34.572 [2024-06-10 10:19:56.298994] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:34.832 [2024-06-10 10:19:56.471462] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:34.832 10:19:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:34.832 "name": "raid_bdev1", 00:24:34.832 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:34.832 "strip_size_kb": 0, 00:24:34.832 "state": "online", 00:24:34.832 "raid_level": "raid1", 00:24:34.832 "superblock": true, 00:24:34.832 "num_base_bdevs": 2, 00:24:34.832 "num_base_bdevs_discovered": 1, 00:24:34.832 "num_base_bdevs_operational": 1, 00:24:34.832 "base_bdevs_list": [ 00:24:34.832 { 00:24:34.832 "name": null, 00:24:34.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.832 "is_configured": false, 00:24:34.832 "data_offset": 256, 00:24:34.832 "data_size": 7936 00:24:34.832 }, 00:24:34.832 { 00:24:34.832 "name": "BaseBdev2", 00:24:34.832 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:34.832 "is_configured": true, 00:24:34.832 "data_offset": 256, 00:24:34.832 "data_size": 7936 00:24:34.832 } 00:24:34.832 ] 00:24:34.832 }' 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:34.832 10:19:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:35.401 10:19:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:35.661 [2024-06-10 10:19:57.405832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:35.661 [2024-06-10 10:19:57.407399] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x152a8d0 00:24:35.661 [2024-06-10 10:19:57.408908] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:35.661 10:19:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:36.601 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:36.601 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:36.601 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:36.601 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:36.601 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:36.601 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.601 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.861 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:36.861 "name": "raid_bdev1", 00:24:36.861 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:36.861 "strip_size_kb": 0, 00:24:36.861 "state": "online", 00:24:36.861 "raid_level": "raid1", 00:24:36.861 "superblock": true, 00:24:36.861 "num_base_bdevs": 2, 00:24:36.861 "num_base_bdevs_discovered": 2, 00:24:36.861 "num_base_bdevs_operational": 2, 00:24:36.861 "process": { 00:24:36.861 "type": "rebuild", 00:24:36.861 "target": "spare", 00:24:36.861 "progress": { 00:24:36.861 "blocks": 2816, 00:24:36.861 "percent": 35 00:24:36.861 } 00:24:36.861 }, 00:24:36.861 "base_bdevs_list": [ 00:24:36.861 { 00:24:36.861 "name": "spare", 00:24:36.861 "uuid": "903474e4-c125-5986-835c-0bc363db9996", 00:24:36.861 "is_configured": true, 00:24:36.861 "data_offset": 256, 00:24:36.861 "data_size": 7936 00:24:36.861 }, 00:24:36.861 { 00:24:36.861 "name": "BaseBdev2", 00:24:36.861 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:36.861 "is_configured": true, 00:24:36.861 "data_offset": 256, 00:24:36.861 "data_size": 7936 00:24:36.861 } 00:24:36.861 ] 00:24:36.861 }' 00:24:36.861 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:36.861 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:36.861 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:36.861 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:36.861 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:37.122 [2024-06-10 10:19:58.858056] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:37.122 [2024-06-10 10:19:58.917826] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:37.122 [2024-06-10 10:19:58.917859] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:37.122 [2024-06-10 10:19:58.917869] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:37.122 [2024-06-10 10:19:58.917873] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 
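
The step traced here removes one member from the running raid1 volume and then re-verifies that raid_bdev1 remains online with a single operational base bdev. A compact, illustrative sketch of that remove-and-verify step, assuming the same RPC socket and bdev names used in this run:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# drop one member ("spare" in this run) from the raid1 volume
$rpc -s $sock bdev_raid_remove_base_bdev spare
# confirm the array stays online but degraded: 1 of 2 base bdevs discovered
$rpc -s $sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs)"'
# expected output for this scenario: online 1/2
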
00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.122 10:19:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:37.382 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:37.382 "name": "raid_bdev1", 00:24:37.382 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:37.382 "strip_size_kb": 0, 00:24:37.382 "state": "online", 00:24:37.382 "raid_level": "raid1", 00:24:37.382 "superblock": true, 00:24:37.382 "num_base_bdevs": 2, 00:24:37.382 "num_base_bdevs_discovered": 1, 00:24:37.382 "num_base_bdevs_operational": 1, 00:24:37.382 "base_bdevs_list": [ 00:24:37.382 { 00:24:37.382 "name": null, 00:24:37.382 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:37.382 "is_configured": false, 00:24:37.382 "data_offset": 256, 00:24:37.382 "data_size": 7936 00:24:37.382 }, 00:24:37.382 { 00:24:37.382 "name": "BaseBdev2", 00:24:37.382 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:37.382 "is_configured": true, 00:24:37.382 "data_offset": 256, 00:24:37.382 "data_size": 7936 00:24:37.382 } 00:24:37.382 ] 00:24:37.382 }' 00:24:37.382 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:37.382 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:37.951 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:37.951 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:37.951 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:37.951 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:37.951 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:37.951 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.951 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.212 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.212 "name": "raid_bdev1", 00:24:38.212 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:38.212 "strip_size_kb": 0, 00:24:38.212 "state": "online", 00:24:38.212 "raid_level": "raid1", 00:24:38.212 "superblock": true, 00:24:38.212 "num_base_bdevs": 2, 00:24:38.212 "num_base_bdevs_discovered": 1, 00:24:38.212 "num_base_bdevs_operational": 1, 00:24:38.212 "base_bdevs_list": [ 00:24:38.212 { 00:24:38.212 "name": null, 00:24:38.212 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:24:38.212 "is_configured": false, 00:24:38.212 "data_offset": 256, 00:24:38.212 "data_size": 7936 00:24:38.212 }, 00:24:38.212 { 00:24:38.212 "name": "BaseBdev2", 00:24:38.212 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:38.212 "is_configured": true, 00:24:38.212 "data_offset": 256, 00:24:38.212 "data_size": 7936 00:24:38.212 } 00:24:38.212 ] 00:24:38.212 }' 00:24:38.212 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.212 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:38.212 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:38.212 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:38.212 10:19:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:38.472 [2024-06-10 10:20:00.110757] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:38.472 [2024-06-10 10:20:00.112317] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x152ced0 00:24:38.472 [2024-06-10 10:20:00.113441] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:38.472 10:20:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:39.410 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:39.410 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:39.410 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:39.410 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:39.410 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:39.410 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.410 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:39.716 "name": "raid_bdev1", 00:24:39.716 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:39.716 "strip_size_kb": 0, 00:24:39.716 "state": "online", 00:24:39.716 "raid_level": "raid1", 00:24:39.716 "superblock": true, 00:24:39.716 "num_base_bdevs": 2, 00:24:39.716 "num_base_bdevs_discovered": 2, 00:24:39.716 "num_base_bdevs_operational": 2, 00:24:39.716 "process": { 00:24:39.716 "type": "rebuild", 00:24:39.716 "target": "spare", 00:24:39.716 "progress": { 00:24:39.716 "blocks": 2816, 00:24:39.716 "percent": 35 00:24:39.716 } 00:24:39.716 }, 00:24:39.716 "base_bdevs_list": [ 00:24:39.716 { 00:24:39.716 "name": "spare", 00:24:39.716 "uuid": "903474e4-c125-5986-835c-0bc363db9996", 00:24:39.716 "is_configured": true, 00:24:39.716 "data_offset": 256, 00:24:39.716 "data_size": 7936 00:24:39.716 }, 00:24:39.716 { 00:24:39.716 "name": 
"BaseBdev2", 00:24:39.716 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:39.716 "is_configured": true, 00:24:39.716 "data_offset": 256, 00:24:39.716 "data_size": 7936 00:24:39.716 } 00:24:39.716 ] 00:24:39.716 }' 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:39.716 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=893 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.716 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.997 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:39.997 "name": "raid_bdev1", 00:24:39.997 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:39.997 "strip_size_kb": 0, 00:24:39.997 "state": "online", 00:24:39.997 "raid_level": "raid1", 00:24:39.997 "superblock": true, 00:24:39.997 "num_base_bdevs": 2, 00:24:39.997 "num_base_bdevs_discovered": 2, 00:24:39.997 "num_base_bdevs_operational": 2, 00:24:39.997 "process": { 00:24:39.997 "type": "rebuild", 00:24:39.997 "target": "spare", 00:24:39.997 "progress": { 00:24:39.997 "blocks": 3584, 00:24:39.997 "percent": 45 00:24:39.997 } 00:24:39.997 }, 00:24:39.997 "base_bdevs_list": [ 00:24:39.997 { 00:24:39.997 "name": "spare", 00:24:39.997 "uuid": "903474e4-c125-5986-835c-0bc363db9996", 00:24:39.997 "is_configured": true, 00:24:39.997 "data_offset": 256, 00:24:39.997 "data_size": 7936 
00:24:39.997 }, 00:24:39.997 { 00:24:39.997 "name": "BaseBdev2", 00:24:39.997 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:39.997 "is_configured": true, 00:24:39.997 "data_offset": 256, 00:24:39.997 "data_size": 7936 00:24:39.997 } 00:24:39.997 ] 00:24:39.997 }' 00:24:39.997 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:39.997 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:39.997 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:39.997 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:39.997 10:20:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:40.937 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:40.937 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:40.937 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:40.937 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:40.937 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:40.937 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:40.937 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.937 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.197 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.197 "name": "raid_bdev1", 00:24:41.197 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:41.197 "strip_size_kb": 0, 00:24:41.197 "state": "online", 00:24:41.197 "raid_level": "raid1", 00:24:41.197 "superblock": true, 00:24:41.197 "num_base_bdevs": 2, 00:24:41.197 "num_base_bdevs_discovered": 2, 00:24:41.197 "num_base_bdevs_operational": 2, 00:24:41.197 "process": { 00:24:41.197 "type": "rebuild", 00:24:41.197 "target": "spare", 00:24:41.197 "progress": { 00:24:41.197 "blocks": 6912, 00:24:41.197 "percent": 87 00:24:41.197 } 00:24:41.197 }, 00:24:41.197 "base_bdevs_list": [ 00:24:41.197 { 00:24:41.197 "name": "spare", 00:24:41.197 "uuid": "903474e4-c125-5986-835c-0bc363db9996", 00:24:41.197 "is_configured": true, 00:24:41.197 "data_offset": 256, 00:24:41.197 "data_size": 7936 00:24:41.197 }, 00:24:41.197 { 00:24:41.197 "name": "BaseBdev2", 00:24:41.197 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:41.197 "is_configured": true, 00:24:41.197 "data_offset": 256, 00:24:41.197 "data_size": 7936 00:24:41.197 } 00:24:41.197 ] 00:24:41.197 }' 00:24:41.197 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.197 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:41.197 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.197 
10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:41.197 10:20:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:41.458 [2024-06-10 10:20:03.231813] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:41.458 [2024-06-10 10:20:03.231861] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:41.458 [2024-06-10 10:20:03.231924] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:42.401 10:20:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:42.401 10:20:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:42.401 10:20:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:42.401 10:20:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:42.401 10:20:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:42.401 10:20:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:42.401 10:20:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.401 10:20:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.401 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:42.401 "name": "raid_bdev1", 00:24:42.401 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:42.401 "strip_size_kb": 0, 00:24:42.401 "state": "online", 00:24:42.401 "raid_level": "raid1", 00:24:42.401 "superblock": true, 00:24:42.401 "num_base_bdevs": 2, 00:24:42.401 "num_base_bdevs_discovered": 2, 00:24:42.401 "num_base_bdevs_operational": 2, 00:24:42.401 "base_bdevs_list": [ 00:24:42.401 { 00:24:42.401 "name": "spare", 00:24:42.401 "uuid": "903474e4-c125-5986-835c-0bc363db9996", 00:24:42.401 "is_configured": true, 00:24:42.401 "data_offset": 256, 00:24:42.401 "data_size": 7936 00:24:42.401 }, 00:24:42.401 { 00:24:42.401 "name": "BaseBdev2", 00:24:42.401 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:42.401 "is_configured": true, 00:24:42.401 "data_offset": 256, 00:24:42.401 "data_size": 7936 00:24:42.401 } 00:24:42.401 ] 00:24:42.401 }' 00:24:42.401 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:42.401 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:42.401 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:42.662 10:20:04 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:42.662 "name": "raid_bdev1", 00:24:42.662 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:42.662 "strip_size_kb": 0, 00:24:42.662 "state": "online", 00:24:42.662 "raid_level": "raid1", 00:24:42.662 "superblock": true, 00:24:42.662 "num_base_bdevs": 2, 00:24:42.662 "num_base_bdevs_discovered": 2, 00:24:42.662 "num_base_bdevs_operational": 2, 00:24:42.662 "base_bdevs_list": [ 00:24:42.662 { 00:24:42.662 "name": "spare", 00:24:42.662 "uuid": "903474e4-c125-5986-835c-0bc363db9996", 00:24:42.662 "is_configured": true, 00:24:42.662 "data_offset": 256, 00:24:42.662 "data_size": 7936 00:24:42.662 }, 00:24:42.662 { 00:24:42.662 "name": "BaseBdev2", 00:24:42.662 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:42.662 "is_configured": true, 00:24:42.662 "data_offset": 256, 00:24:42.662 "data_size": 7936 00:24:42.662 } 00:24:42.662 ] 00:24:42.662 }' 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:42.662 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:42.922 "name": "raid_bdev1", 00:24:42.922 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:42.922 "strip_size_kb": 0, 00:24:42.922 "state": "online", 00:24:42.922 "raid_level": "raid1", 00:24:42.922 "superblock": true, 00:24:42.922 "num_base_bdevs": 2, 00:24:42.922 "num_base_bdevs_discovered": 2, 00:24:42.922 "num_base_bdevs_operational": 2, 00:24:42.922 "base_bdevs_list": [ 00:24:42.922 { 00:24:42.922 "name": "spare", 00:24:42.922 "uuid": "903474e4-c125-5986-835c-0bc363db9996", 00:24:42.922 "is_configured": true, 00:24:42.922 "data_offset": 256, 00:24:42.922 "data_size": 7936 00:24:42.922 }, 00:24:42.922 { 00:24:42.922 "name": "BaseBdev2", 00:24:42.922 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:42.922 "is_configured": true, 00:24:42.922 "data_offset": 256, 00:24:42.922 "data_size": 7936 00:24:42.922 } 00:24:42.922 ] 00:24:42.922 }' 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:42.922 10:20:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:43.493 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:43.753 [2024-06-10 10:20:05.477702] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:43.753 [2024-06-10 10:20:05.477725] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:43.753 [2024-06-10 10:20:05.477766] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:43.753 [2024-06-10 10:20:05.477810] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:43.753 [2024-06-10 10:20:05.477816] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1529a60 name raid_bdev1, state offline 00:24:43.753 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.753 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:44.015 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:44.275 /dev/nbd0 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:44.275 1+0 records in 00:24:44.275 1+0 records out 00:24:44.275 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025322 s, 16.2 MB/s 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:44.275 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:44.276 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:24:44.276 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:44.276 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:44.276 10:20:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:44.276 /dev/nbd1 00:24:44.276 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:44.276 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:44.276 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:24:44.276 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:24:44.276 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:44.276 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:44.276 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:24:44.276 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:44.536 1+0 records in 00:24:44.536 1+0 records out 00:24:44.536 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271836 s, 15.1 MB/s 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:44.536 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd0 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:44.797 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:45.059 10:20:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:45.319 [2024-06-10 10:20:06.993768] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:45.319 [2024-06-10 10:20:06.993804] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:45.319 [2024-06-10 10:20:06.993817] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e4080 00:24:45.319 [2024-06-10 10:20:06.993828] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:45.319 [2024-06-10 10:20:06.995069] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:45.319 [2024-06-10 10:20:06.995091] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:45.319 [2024-06-10 10:20:06.995133] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:45.319 [2024-06-10 10:20:06.995152] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:45.319 [2024-06-10 10:20:06.995222] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev2 is claimed 00:24:45.319 spare 00:24:45.319 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:45.319 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:45.319 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:45.319 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:45.319 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:45.319 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:45.319 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:45.320 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:45.320 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:45.320 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:45.320 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.320 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.320 [2024-06-10 10:20:07.095506] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x157ecd0 00:24:45.320 [2024-06-10 10:20:07.095515] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:45.320 [2024-06-10 10:20:07.095565] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x152b080 00:24:45.320 [2024-06-10 10:20:07.095653] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x157ecd0 00:24:45.320 [2024-06-10 10:20:07.095658] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x157ecd0 00:24:45.320 [2024-06-10 10:20:07.095712] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:45.580 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:45.580 "name": "raid_bdev1", 00:24:45.580 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:45.580 "strip_size_kb": 0, 00:24:45.580 "state": "online", 00:24:45.580 "raid_level": "raid1", 00:24:45.580 "superblock": true, 00:24:45.580 "num_base_bdevs": 2, 00:24:45.580 "num_base_bdevs_discovered": 2, 00:24:45.580 "num_base_bdevs_operational": 2, 00:24:45.580 "base_bdevs_list": [ 00:24:45.580 { 00:24:45.580 "name": "spare", 00:24:45.580 "uuid": "903474e4-c125-5986-835c-0bc363db9996", 00:24:45.580 "is_configured": true, 00:24:45.580 "data_offset": 256, 00:24:45.580 "data_size": 7936 00:24:45.580 }, 00:24:45.580 { 00:24:45.580 "name": "BaseBdev2", 00:24:45.580 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:45.580 "is_configured": true, 00:24:45.580 "data_offset": 256, 00:24:45.580 "data_size": 7936 00:24:45.580 } 00:24:45.580 ] 00:24:45.580 }' 00:24:45.580 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:45.580 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 
-- # set +x 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:46.164 "name": "raid_bdev1", 00:24:46.164 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:46.164 "strip_size_kb": 0, 00:24:46.164 "state": "online", 00:24:46.164 "raid_level": "raid1", 00:24:46.164 "superblock": true, 00:24:46.164 "num_base_bdevs": 2, 00:24:46.164 "num_base_bdevs_discovered": 2, 00:24:46.164 "num_base_bdevs_operational": 2, 00:24:46.164 "base_bdevs_list": [ 00:24:46.164 { 00:24:46.164 "name": "spare", 00:24:46.164 "uuid": "903474e4-c125-5986-835c-0bc363db9996", 00:24:46.164 "is_configured": true, 00:24:46.164 "data_offset": 256, 00:24:46.164 "data_size": 7936 00:24:46.164 }, 00:24:46.164 { 00:24:46.164 "name": "BaseBdev2", 00:24:46.164 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:46.164 "is_configured": true, 00:24:46.164 "data_offset": 256, 00:24:46.164 "data_size": 7936 00:24:46.164 } 00:24:46.164 ] 00:24:46.164 }' 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.164 10:20:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:46.425 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:46.425 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:46.686 [2024-06-10 10:20:08.361311] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:46.686 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:46.686 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:46.686 10:20:08 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:46.686 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:46.686 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:46.686 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:46.686 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:46.686 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:46.686 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:46.686 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:46.686 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.686 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.948 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:46.948 "name": "raid_bdev1", 00:24:46.948 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:46.948 "strip_size_kb": 0, 00:24:46.948 "state": "online", 00:24:46.948 "raid_level": "raid1", 00:24:46.948 "superblock": true, 00:24:46.948 "num_base_bdevs": 2, 00:24:46.948 "num_base_bdevs_discovered": 1, 00:24:46.948 "num_base_bdevs_operational": 1, 00:24:46.948 "base_bdevs_list": [ 00:24:46.948 { 00:24:46.948 "name": null, 00:24:46.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.948 "is_configured": false, 00:24:46.948 "data_offset": 256, 00:24:46.948 "data_size": 7936 00:24:46.948 }, 00:24:46.948 { 00:24:46.948 "name": "BaseBdev2", 00:24:46.948 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:46.948 "is_configured": true, 00:24:46.948 "data_offset": 256, 00:24:46.948 "data_size": 7936 00:24:46.948 } 00:24:46.948 ] 00:24:46.948 }' 00:24:46.948 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:46.948 10:20:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:47.520 10:20:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:47.520 [2024-06-10 10:20:09.295673] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:47.520 [2024-06-10 10:20:09.295780] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:47.520 [2024-06-10 10:20:09.295788] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
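# Condensed sketch of the re-add / rebuild-verification sequence recorded above and
# continued below (assumes the same rpc.py path, RPC socket, raid bdev name and base
# bdev names captured in this trace; addresses and timings differ between runs):
rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Re-add the spare base bdev; its superblock seq_number (4) is older than the raid
# bdev's (5), so examine_sb re-adds it and a rebuild process targeting "spare" starts.
"$rpc_py" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare
# Poll the raid bdev: first expect process type "rebuild" with target "spare", then
# "none"/"none" once the rebuild has finished.
"$rpc_py" -s "$sock" bdev_raid_get_bdevs all \
  | jq -r '.[] | select(.name == "raid_bdev1") | (.process.type // "none"), (.process.target // "none")'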
00:24:47.520 [2024-06-10 10:20:09.295806] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:47.520 [2024-06-10 10:20:09.297336] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x152c7a0 00:24:47.520 [2024-06-10 10:20:09.298799] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:47.520 10:20:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:48.464 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:48.464 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:48.464 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:48.464 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:48.464 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:48.464 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.464 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.724 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:48.724 "name": "raid_bdev1", 00:24:48.724 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:48.724 "strip_size_kb": 0, 00:24:48.724 "state": "online", 00:24:48.724 "raid_level": "raid1", 00:24:48.724 "superblock": true, 00:24:48.724 "num_base_bdevs": 2, 00:24:48.724 "num_base_bdevs_discovered": 2, 00:24:48.724 "num_base_bdevs_operational": 2, 00:24:48.724 "process": { 00:24:48.724 "type": "rebuild", 00:24:48.724 "target": "spare", 00:24:48.724 "progress": { 00:24:48.724 "blocks": 2816, 00:24:48.724 "percent": 35 00:24:48.724 } 00:24:48.724 }, 00:24:48.724 "base_bdevs_list": [ 00:24:48.724 { 00:24:48.724 "name": "spare", 00:24:48.724 "uuid": "903474e4-c125-5986-835c-0bc363db9996", 00:24:48.724 "is_configured": true, 00:24:48.724 "data_offset": 256, 00:24:48.724 "data_size": 7936 00:24:48.724 }, 00:24:48.724 { 00:24:48.724 "name": "BaseBdev2", 00:24:48.724 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:48.724 "is_configured": true, 00:24:48.724 "data_offset": 256, 00:24:48.724 "data_size": 7936 00:24:48.724 } 00:24:48.724 ] 00:24:48.724 }' 00:24:48.724 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:48.724 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:48.725 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:48.725 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:48.725 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:48.986 [2024-06-10 10:20:10.724310] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:48.986 [2024-06-10 10:20:10.807773] bdev_raid.c:2513:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:48.986 [2024-06-10 10:20:10.807806] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:48.986 [2024-06-10 10:20:10.807815] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:48.986 [2024-06-10 10:20:10.807819] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.986 10:20:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.247 10:20:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:49.247 "name": "raid_bdev1", 00:24:49.247 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:49.247 "strip_size_kb": 0, 00:24:49.247 "state": "online", 00:24:49.247 "raid_level": "raid1", 00:24:49.247 "superblock": true, 00:24:49.247 "num_base_bdevs": 2, 00:24:49.247 "num_base_bdevs_discovered": 1, 00:24:49.247 "num_base_bdevs_operational": 1, 00:24:49.247 "base_bdevs_list": [ 00:24:49.247 { 00:24:49.247 "name": null, 00:24:49.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:49.247 "is_configured": false, 00:24:49.247 "data_offset": 256, 00:24:49.247 "data_size": 7936 00:24:49.247 }, 00:24:49.247 { 00:24:49.247 "name": "BaseBdev2", 00:24:49.247 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:49.247 "is_configured": true, 00:24:49.247 "data_offset": 256, 00:24:49.247 "data_size": 7936 00:24:49.247 } 00:24:49.247 ] 00:24:49.247 }' 00:24:49.247 10:20:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:49.247 10:20:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:49.819 10:20:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:50.080 [2024-06-10 10:20:11.744063] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 
00:24:50.080 [2024-06-10 10:20:11.744096] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:50.080 [2024-06-10 10:20:11.744110] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x157ef50 00:24:50.080 [2024-06-10 10:20:11.744117] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:50.080 [2024-06-10 10:20:11.744289] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:50.080 [2024-06-10 10:20:11.744299] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:50.080 [2024-06-10 10:20:11.744340] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:50.080 [2024-06-10 10:20:11.744346] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:50.080 [2024-06-10 10:20:11.744352] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:50.080 [2024-06-10 10:20:11.744363] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:50.080 [2024-06-10 10:20:11.745911] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1529d40 00:24:50.080 [2024-06-10 10:20:11.747059] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:50.080 spare 00:24:50.080 10:20:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:51.021 10:20:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:51.021 10:20:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:51.021 10:20:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:51.021 10:20:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:51.021 10:20:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:51.021 10:20:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.021 10:20:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.282 10:20:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:51.282 "name": "raid_bdev1", 00:24:51.282 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:51.282 "strip_size_kb": 0, 00:24:51.282 "state": "online", 00:24:51.282 "raid_level": "raid1", 00:24:51.282 "superblock": true, 00:24:51.282 "num_base_bdevs": 2, 00:24:51.282 "num_base_bdevs_discovered": 2, 00:24:51.282 "num_base_bdevs_operational": 2, 00:24:51.282 "process": { 00:24:51.282 "type": "rebuild", 00:24:51.282 "target": "spare", 00:24:51.282 "progress": { 00:24:51.282 "blocks": 2816, 00:24:51.282 "percent": 35 00:24:51.282 } 00:24:51.282 }, 00:24:51.282 "base_bdevs_list": [ 00:24:51.282 { 00:24:51.282 "name": "spare", 00:24:51.282 "uuid": "903474e4-c125-5986-835c-0bc363db9996", 00:24:51.282 "is_configured": true, 00:24:51.282 "data_offset": 256, 00:24:51.282 "data_size": 7936 00:24:51.282 }, 00:24:51.282 { 00:24:51.282 "name": "BaseBdev2", 00:24:51.282 "uuid": 
"bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:51.282 "is_configured": true, 00:24:51.282 "data_offset": 256, 00:24:51.282 "data_size": 7936 00:24:51.282 } 00:24:51.282 ] 00:24:51.282 }' 00:24:51.283 10:20:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:51.283 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:51.283 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:51.283 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:51.283 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:51.543 [2024-06-10 10:20:13.216321] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:51.543 [2024-06-10 10:20:13.255999] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:51.543 [2024-06-10 10:20:13.256028] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:51.543 [2024-06-10 10:20:13.256038] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:51.543 [2024-06-10 10:20:13.256042] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:51.543 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:51.543 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:51.544 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:51.544 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:51.544 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:51.544 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:51.544 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:51.544 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:51.544 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:51.544 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:51.544 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.544 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.804 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:51.804 "name": "raid_bdev1", 00:24:51.804 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:51.804 "strip_size_kb": 0, 00:24:51.804 "state": "online", 00:24:51.804 "raid_level": "raid1", 00:24:51.804 "superblock": true, 00:24:51.804 "num_base_bdevs": 2, 00:24:51.804 "num_base_bdevs_discovered": 1, 00:24:51.804 
"num_base_bdevs_operational": 1, 00:24:51.804 "base_bdevs_list": [ 00:24:51.804 { 00:24:51.804 "name": null, 00:24:51.804 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.804 "is_configured": false, 00:24:51.804 "data_offset": 256, 00:24:51.804 "data_size": 7936 00:24:51.804 }, 00:24:51.804 { 00:24:51.804 "name": "BaseBdev2", 00:24:51.804 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:51.804 "is_configured": true, 00:24:51.804 "data_offset": 256, 00:24:51.804 "data_size": 7936 00:24:51.804 } 00:24:51.804 ] 00:24:51.804 }' 00:24:51.804 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:51.804 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:52.376 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:52.376 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:52.376 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:52.376 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:52.376 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:52.376 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.376 10:20:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:52.376 10:20:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:52.376 "name": "raid_bdev1", 00:24:52.376 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:52.376 "strip_size_kb": 0, 00:24:52.376 "state": "online", 00:24:52.376 "raid_level": "raid1", 00:24:52.376 "superblock": true, 00:24:52.376 "num_base_bdevs": 2, 00:24:52.376 "num_base_bdevs_discovered": 1, 00:24:52.376 "num_base_bdevs_operational": 1, 00:24:52.376 "base_bdevs_list": [ 00:24:52.376 { 00:24:52.376 "name": null, 00:24:52.376 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:52.376 "is_configured": false, 00:24:52.376 "data_offset": 256, 00:24:52.376 "data_size": 7936 00:24:52.376 }, 00:24:52.376 { 00:24:52.376 "name": "BaseBdev2", 00:24:52.376 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:52.376 "is_configured": true, 00:24:52.376 "data_offset": 256, 00:24:52.376 "data_size": 7936 00:24:52.376 } 00:24:52.376 ] 00:24:52.376 }' 00:24:52.376 10:20:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:52.376 10:20:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:52.376 10:20:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:52.376 10:20:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:52.376 10:20:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:52.637 10:20:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:52.899 [2024-06-10 10:20:14.577053] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:52.899 [2024-06-10 10:20:14.577083] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:52.899 [2024-06-10 10:20:14.577095] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e5680 00:24:52.899 [2024-06-10 10:20:14.577101] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:52.899 [2024-06-10 10:20:14.577245] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:52.899 [2024-06-10 10:20:14.577255] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:52.899 [2024-06-10 10:20:14.577286] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:52.899 [2024-06-10 10:20:14.577292] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:52.899 [2024-06-10 10:20:14.577297] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:52.899 BaseBdev1 00:24:52.899 10:20:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.841 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:54.103 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:54.103 "name": "raid_bdev1", 00:24:54.103 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:54.103 "strip_size_kb": 0, 00:24:54.103 "state": "online", 00:24:54.103 "raid_level": "raid1", 00:24:54.103 "superblock": true, 00:24:54.103 "num_base_bdevs": 2, 00:24:54.103 "num_base_bdevs_discovered": 1, 00:24:54.103 "num_base_bdevs_operational": 1, 00:24:54.103 "base_bdevs_list": [ 00:24:54.103 { 
00:24:54.103 "name": null, 00:24:54.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:54.103 "is_configured": false, 00:24:54.103 "data_offset": 256, 00:24:54.103 "data_size": 7936 00:24:54.103 }, 00:24:54.103 { 00:24:54.103 "name": "BaseBdev2", 00:24:54.103 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:54.103 "is_configured": true, 00:24:54.103 "data_offset": 256, 00:24:54.103 "data_size": 7936 00:24:54.103 } 00:24:54.103 ] 00:24:54.103 }' 00:24:54.103 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:54.103 10:20:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:54.674 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:54.674 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:54.674 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:54.674 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:54.674 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:54.674 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.674 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:54.674 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:54.674 "name": "raid_bdev1", 00:24:54.674 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:54.674 "strip_size_kb": 0, 00:24:54.674 "state": "online", 00:24:54.674 "raid_level": "raid1", 00:24:54.674 "superblock": true, 00:24:54.674 "num_base_bdevs": 2, 00:24:54.674 "num_base_bdevs_discovered": 1, 00:24:54.674 "num_base_bdevs_operational": 1, 00:24:54.674 "base_bdevs_list": [ 00:24:54.674 { 00:24:54.674 "name": null, 00:24:54.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:54.674 "is_configured": false, 00:24:54.674 "data_offset": 256, 00:24:54.674 "data_size": 7936 00:24:54.674 }, 00:24:54.674 { 00:24:54.674 "name": "BaseBdev2", 00:24:54.674 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:54.674 "is_configured": true, 00:24:54.674 "data_offset": 256, 00:24:54.674 "data_size": 7936 00:24:54.674 } 00:24:54.674 ] 00:24:54.674 }' 00:24:54.674 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:54.674 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:54.674 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@649 -- # local es=0 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:54.933 [2024-06-10 10:20:16.742532] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:54.933 [2024-06-10 10:20:16.742622] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:54.933 [2024-06-10 10:20:16.742630] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:54.933 request: 00:24:54.933 { 00:24:54.933 "raid_bdev": "raid_bdev1", 00:24:54.933 "base_bdev": "BaseBdev1", 00:24:54.933 "method": "bdev_raid_add_base_bdev", 00:24:54.933 "req_id": 1 00:24:54.933 } 00:24:54.933 Got JSON-RPC error response 00:24:54.933 response: 00:24:54.933 { 00:24:54.933 "code": -22, 00:24:54.933 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:54.933 } 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # es=1 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:24:54.933 10:20:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:56.316 10:20:17 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.316 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:56.316 "name": "raid_bdev1", 00:24:56.316 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:56.316 "strip_size_kb": 0, 00:24:56.316 "state": "online", 00:24:56.316 "raid_level": "raid1", 00:24:56.316 "superblock": true, 00:24:56.316 "num_base_bdevs": 2, 00:24:56.316 "num_base_bdevs_discovered": 1, 00:24:56.316 "num_base_bdevs_operational": 1, 00:24:56.316 "base_bdevs_list": [ 00:24:56.316 { 00:24:56.316 "name": null, 00:24:56.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.316 "is_configured": false, 00:24:56.316 "data_offset": 256, 00:24:56.316 "data_size": 7936 00:24:56.317 }, 00:24:56.317 { 00:24:56.317 "name": "BaseBdev2", 00:24:56.317 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:56.317 "is_configured": true, 00:24:56.317 "data_offset": 256, 00:24:56.317 "data_size": 7936 00:24:56.317 } 00:24:56.317 ] 00:24:56.317 }' 00:24:56.317 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:56.317 10:20:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:56.886 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:56.886 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:56.886 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:56.886 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:56.886 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:56.886 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.886 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.886 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:56.886 "name": "raid_bdev1", 00:24:56.886 "uuid": "69e0929b-2889-4a3c-b49c-8b5da1420f25", 00:24:56.886 "strip_size_kb": 0, 
00:24:56.886 "state": "online", 00:24:56.887 "raid_level": "raid1", 00:24:56.887 "superblock": true, 00:24:56.887 "num_base_bdevs": 2, 00:24:56.887 "num_base_bdevs_discovered": 1, 00:24:56.887 "num_base_bdevs_operational": 1, 00:24:56.887 "base_bdevs_list": [ 00:24:56.887 { 00:24:56.887 "name": null, 00:24:56.887 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.887 "is_configured": false, 00:24:56.887 "data_offset": 256, 00:24:56.887 "data_size": 7936 00:24:56.887 }, 00:24:56.887 { 00:24:56.887 "name": "BaseBdev2", 00:24:56.887 "uuid": "bff47ffd-3101-5eb7-8a61-21f6300b968a", 00:24:56.887 "is_configured": true, 00:24:56.887 "data_offset": 256, 00:24:56.887 "data_size": 7936 00:24:56.887 } 00:24:56.887 ] 00:24:56.887 }' 00:24:56.887 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:56.887 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:56.887 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 1118436 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@949 -- # '[' -z 1118436 ']' 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # kill -0 1118436 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # uname 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1118436 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1118436' 00:24:57.147 killing process with pid 1118436 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@968 -- # kill 1118436 00:24:57.147 Received shutdown signal, test time was about 60.000000 seconds 00:24:57.147 00:24:57.147 Latency(us) 00:24:57.147 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:57.147 =================================================================================================================== 00:24:57.147 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:57.147 [2024-06-10 10:20:18.822663] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:57.147 [2024-06-10 10:20:18.822726] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:57.147 [2024-06-10 10:20:18.822759] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:57.147 [2024-06-10 10:20:18.822766] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x157ecd0 name raid_bdev1, state offline 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@973 -- # wait 
1118436 00:24:57.147 [2024-06-10 10:20:18.841123] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:24:57.147 00:24:57.147 real 0m26.933s 00:24:57.147 user 0m42.193s 00:24:57.147 sys 0m3.279s 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:57.147 10:20:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:57.147 ************************************ 00:24:57.147 END TEST raid_rebuild_test_sb_md_separate 00:24:57.147 ************************************ 00:24:57.147 10:20:18 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:24:57.147 10:20:18 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:24:57.147 10:20:19 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:24:57.147 10:20:19 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:57.147 10:20:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:57.408 ************************************ 00:24:57.408 START TEST raid_state_function_test_sb_md_interleaved 00:24:57.408 ************************************ 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:24:57.408 10:20:19 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=1123454 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1123454' 00:24:57.408 Process raid pid: 1123454 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 1123454 /var/tmp/spdk-raid.sock 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 1123454 ']' 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:57.408 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:57.408 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:57.408 [2024-06-10 10:20:19.100345] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
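The trace above shows raid_state_function_test_sb_md_interleaved launching the bdev_svc app on the /var/tmp/spdk-raid.sock RPC socket with -L bdev_raid tracing, then driving it through scripts/rpc.py. A minimal hand-run sketch of that setup follows; it reuses only the paths and RPC invocations that appear verbatim in this log, and the rpc_get_methods readiness probe is an assumption standing in for the test's waitforlisten helper, not something taken from the log.

  # Sketch only: start bdev_svc and build the md-interleaved RAID1 set over JSON-RPC,
  # using the same commands the test issues above.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  SOCK=/var/tmp/spdk-raid.sock

  # RPC target with bdev_raid debug tracing, as in the log
  $SPDK/test/app/bdev_svc/bdev_svc -r $SOCK -i 0 -L bdev_raid &

  # crude readiness check (assumed stand-in for waitforlisten)
  until $SPDK/scripts/rpc.py -s $SOCK rpc_get_methods >/dev/null 2>&1; do sleep 0.2; done

  # two 32 MiB malloc base bdevs, 4 KiB blocks, 32-byte interleaved metadata (-m 32 -i)
  $SPDK/scripts/rpc.py -s $SOCK bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1
  $SPDK/scripts/rpc.py -s $SOCK bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2

  # RAID1 with on-disk superblock (-s) across the two base bdevs
  $SPDK/scripts/rpc.py -s $SOCK bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

The test itself creates Existed_Raid before the base bdevs exist (hence the "configuring" state seen below) and relies on the bdev_raid.sh helpers for cleanup; the sketch simplifies that ordering.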
00:24:57.408 [2024-06-10 10:20:19.100390] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:24:57.408 [2024-06-10 10:20:19.189126] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:57.408 [2024-06-10 10:20:19.254735] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:57.669 [2024-06-10 10:20:19.299322] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:57.669 [2024-06-10 10:20:19.299345] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:58.240 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:58.240 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:24:58.240 10:20:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:58.501 [2024-06-10 10:20:20.106789] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:58.502 [2024-06-10 10:20:20.106827] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:58.502 [2024-06-10 10:20:20.106833] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:58.502 [2024-06-10 10:20:20.106839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:24:58.502 "name": "Existed_Raid", 00:24:58.502 "uuid": "84848504-7a73-4f72-9603-f2d45a79de04", 00:24:58.502 "strip_size_kb": 0, 00:24:58.502 "state": "configuring", 00:24:58.502 "raid_level": "raid1", 00:24:58.502 "superblock": true, 00:24:58.502 "num_base_bdevs": 2, 00:24:58.502 "num_base_bdevs_discovered": 0, 00:24:58.502 "num_base_bdevs_operational": 2, 00:24:58.502 "base_bdevs_list": [ 00:24:58.502 { 00:24:58.502 "name": "BaseBdev1", 00:24:58.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.502 "is_configured": false, 00:24:58.502 "data_offset": 0, 00:24:58.502 "data_size": 0 00:24:58.502 }, 00:24:58.502 { 00:24:58.502 "name": "BaseBdev2", 00:24:58.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.502 "is_configured": false, 00:24:58.502 "data_offset": 0, 00:24:58.502 "data_size": 0 00:24:58.502 } 00:24:58.502 ] 00:24:58.502 }' 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:58.502 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:59.073 10:20:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:59.334 [2024-06-10 10:20:21.000931] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:59.334 [2024-06-10 10:20:21.000946] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcbeb00 name Existed_Raid, state configuring 00:24:59.334 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:59.334 [2024-06-10 10:20:21.173388] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:59.334 [2024-06-10 10:20:21.173405] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:59.334 [2024-06-10 10:20:21.173413] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:59.334 [2024-06-10 10:20:21.173419] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:59.334 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:24:59.597 [2024-06-10 10:20:21.368597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:59.597 BaseBdev1 00:24:59.597 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:24:59.597 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:24:59.597 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:59.597 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local i 00:24:59.597 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:59.597 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 
-- # bdev_timeout=2000 00:24:59.597 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:59.904 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:59.905 [ 00:24:59.905 { 00:24:59.905 "name": "BaseBdev1", 00:24:59.905 "aliases": [ 00:24:59.905 "7e8bfc30-4ca6-474f-a09b-f7ccd9c37fbc" 00:24:59.905 ], 00:24:59.905 "product_name": "Malloc disk", 00:24:59.905 "block_size": 4128, 00:24:59.905 "num_blocks": 8192, 00:24:59.905 "uuid": "7e8bfc30-4ca6-474f-a09b-f7ccd9c37fbc", 00:24:59.905 "md_size": 32, 00:24:59.905 "md_interleave": true, 00:24:59.905 "dif_type": 0, 00:24:59.905 "assigned_rate_limits": { 00:24:59.905 "rw_ios_per_sec": 0, 00:24:59.905 "rw_mbytes_per_sec": 0, 00:24:59.905 "r_mbytes_per_sec": 0, 00:24:59.905 "w_mbytes_per_sec": 0 00:24:59.905 }, 00:24:59.905 "claimed": true, 00:24:59.905 "claim_type": "exclusive_write", 00:24:59.905 "zoned": false, 00:24:59.905 "supported_io_types": { 00:24:59.905 "read": true, 00:24:59.905 "write": true, 00:24:59.905 "unmap": true, 00:24:59.905 "write_zeroes": true, 00:24:59.905 "flush": true, 00:24:59.905 "reset": true, 00:24:59.905 "compare": false, 00:24:59.905 "compare_and_write": false, 00:24:59.905 "abort": true, 00:24:59.905 "nvme_admin": false, 00:24:59.905 "nvme_io": false 00:24:59.905 }, 00:24:59.905 "memory_domains": [ 00:24:59.905 { 00:24:59.905 "dma_device_id": "system", 00:24:59.905 "dma_device_type": 1 00:24:59.905 }, 00:24:59.905 { 00:24:59.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:59.905 "dma_device_type": 2 00:24:59.905 } 00:24:59.905 ], 00:24:59.905 "driver_specific": {} 00:24:59.905 } 00:24:59.905 ] 00:24:59.905 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # return 0 00:24:59.905 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:59.905 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:59.905 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:59.905 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:59.905 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:59.905 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:59.905 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:00.183 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:00.183 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:00.183 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:00.183 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
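The verify_raid_bdev_state calls traced here capture bdev_raid_get_bdevs output and select the raid bdev with jq before comparing individual fields. A condensed stand-alone rendition of that pattern is sketched below; the RPC and jq invocations are the ones shown in the log, while the variable names and the bare exit-on-mismatch handling are simplifications, not the actual helper body from bdev_raid.sh.

  # Sketch: query the raid bdev over RPC and assert a few fields with jq,
  # mirroring the verify_raid_bdev_state flow traced above.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  SOCK=/var/tmp/spdk-raid.sock

  info=$($SPDK/scripts/rpc.py -s $SOCK bdev_raid_get_bdevs all |
         jq -r '.[] | select(.name == "Existed_Raid")')

  # with only BaseBdev1 present, the raid is still assembling
  [[ $(jq -r '.state'      <<<"$info") == configuring ]] || exit 1
  [[ $(jq -r '.raid_level' <<<"$info") == raid1 ]]       || exit 1
  [[ $(jq -r '.num_base_bdevs_discovered' <<<"$info") == 1 ]] || exit 1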
00:25:00.183 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.183 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:00.183 "name": "Existed_Raid", 00:25:00.183 "uuid": "e7fa7b91-823c-4157-a9e6-4ab8990adb66", 00:25:00.183 "strip_size_kb": 0, 00:25:00.183 "state": "configuring", 00:25:00.183 "raid_level": "raid1", 00:25:00.183 "superblock": true, 00:25:00.183 "num_base_bdevs": 2, 00:25:00.183 "num_base_bdevs_discovered": 1, 00:25:00.183 "num_base_bdevs_operational": 2, 00:25:00.183 "base_bdevs_list": [ 00:25:00.183 { 00:25:00.183 "name": "BaseBdev1", 00:25:00.183 "uuid": "7e8bfc30-4ca6-474f-a09b-f7ccd9c37fbc", 00:25:00.183 "is_configured": true, 00:25:00.183 "data_offset": 256, 00:25:00.183 "data_size": 7936 00:25:00.183 }, 00:25:00.183 { 00:25:00.183 "name": "BaseBdev2", 00:25:00.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:00.183 "is_configured": false, 00:25:00.183 "data_offset": 0, 00:25:00.183 "data_size": 0 00:25:00.183 } 00:25:00.183 ] 00:25:00.183 }' 00:25:00.183 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:00.183 10:20:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:00.751 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:01.011 [2024-06-10 10:20:22.631828] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:01.011 [2024-06-10 10:20:22.631854] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcbe3f0 name Existed_Raid, state configuring 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:01.011 [2024-06-10 10:20:22.772210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:01.011 [2024-06-10 10:20:22.773348] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:01.011 [2024-06-10 10:20:22.773371] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:01.011 
10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.011 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:01.272 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:01.272 "name": "Existed_Raid", 00:25:01.272 "uuid": "98285036-989d-4667-95fa-b706585250af", 00:25:01.272 "strip_size_kb": 0, 00:25:01.272 "state": "configuring", 00:25:01.272 "raid_level": "raid1", 00:25:01.272 "superblock": true, 00:25:01.272 "num_base_bdevs": 2, 00:25:01.272 "num_base_bdevs_discovered": 1, 00:25:01.272 "num_base_bdevs_operational": 2, 00:25:01.272 "base_bdevs_list": [ 00:25:01.272 { 00:25:01.272 "name": "BaseBdev1", 00:25:01.272 "uuid": "7e8bfc30-4ca6-474f-a09b-f7ccd9c37fbc", 00:25:01.272 "is_configured": true, 00:25:01.272 "data_offset": 256, 00:25:01.272 "data_size": 7936 00:25:01.272 }, 00:25:01.272 { 00:25:01.272 "name": "BaseBdev2", 00:25:01.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:01.272 "is_configured": false, 00:25:01.272 "data_offset": 0, 00:25:01.272 "data_size": 0 00:25:01.272 } 00:25:01.272 ] 00:25:01.272 }' 00:25:01.272 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:01.272 10:20:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:01.843 10:20:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:25:01.843 [2024-06-10 10:20:23.695650] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:01.843 [2024-06-10 10:20:23.695744] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcbdc00 00:25:01.843 [2024-06-10 10:20:23.695752] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:01.843 [2024-06-10 10:20:23.695791] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcbdbd0 00:25:01.843 [2024-06-10 10:20:23.695853] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcbdc00 00:25:01.843 [2024-06-10 10:20:23.695859] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcbdc00 00:25:01.843 [2024-06-10 10:20:23.695898] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:01.843 BaseBdev2 00:25:01.843 10:20:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:25:01.843 10:20:23 
bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:25:01.843 10:20:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:01.843 10:20:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local i 00:25:01.843 10:20:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:01.843 10:20:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:01.843 10:20:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:02.103 10:20:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:02.364 [ 00:25:02.364 { 00:25:02.364 "name": "BaseBdev2", 00:25:02.364 "aliases": [ 00:25:02.364 "bdadb5dd-47d3-4c11-8bf1-1887e4e2c982" 00:25:02.364 ], 00:25:02.364 "product_name": "Malloc disk", 00:25:02.364 "block_size": 4128, 00:25:02.364 "num_blocks": 8192, 00:25:02.364 "uuid": "bdadb5dd-47d3-4c11-8bf1-1887e4e2c982", 00:25:02.364 "md_size": 32, 00:25:02.364 "md_interleave": true, 00:25:02.364 "dif_type": 0, 00:25:02.364 "assigned_rate_limits": { 00:25:02.364 "rw_ios_per_sec": 0, 00:25:02.364 "rw_mbytes_per_sec": 0, 00:25:02.364 "r_mbytes_per_sec": 0, 00:25:02.364 "w_mbytes_per_sec": 0 00:25:02.364 }, 00:25:02.364 "claimed": true, 00:25:02.364 "claim_type": "exclusive_write", 00:25:02.364 "zoned": false, 00:25:02.364 "supported_io_types": { 00:25:02.364 "read": true, 00:25:02.364 "write": true, 00:25:02.364 "unmap": true, 00:25:02.364 "write_zeroes": true, 00:25:02.364 "flush": true, 00:25:02.364 "reset": true, 00:25:02.364 "compare": false, 00:25:02.364 "compare_and_write": false, 00:25:02.364 "abort": true, 00:25:02.364 "nvme_admin": false, 00:25:02.364 "nvme_io": false 00:25:02.364 }, 00:25:02.364 "memory_domains": [ 00:25:02.364 { 00:25:02.364 "dma_device_id": "system", 00:25:02.364 "dma_device_type": 1 00:25:02.364 }, 00:25:02.364 { 00:25:02.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:02.364 "dma_device_type": 2 00:25:02.364 } 00:25:02.364 ], 00:25:02.364 "driver_specific": {} 00:25:02.364 } 00:25:02.364 ] 00:25:02.364 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # return 0 00:25:02.364 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:02.364 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:02.364 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:25:02.364 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:02.364 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:02.364 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:02.364 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:02.364 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:02.364 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:02.365 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:02.365 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:02.365 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:02.365 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.365 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:02.365 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:02.365 "name": "Existed_Raid", 00:25:02.365 "uuid": "98285036-989d-4667-95fa-b706585250af", 00:25:02.365 "strip_size_kb": 0, 00:25:02.365 "state": "online", 00:25:02.365 "raid_level": "raid1", 00:25:02.365 "superblock": true, 00:25:02.365 "num_base_bdevs": 2, 00:25:02.365 "num_base_bdevs_discovered": 2, 00:25:02.365 "num_base_bdevs_operational": 2, 00:25:02.365 "base_bdevs_list": [ 00:25:02.365 { 00:25:02.365 "name": "BaseBdev1", 00:25:02.365 "uuid": "7e8bfc30-4ca6-474f-a09b-f7ccd9c37fbc", 00:25:02.365 "is_configured": true, 00:25:02.365 "data_offset": 256, 00:25:02.365 "data_size": 7936 00:25:02.365 }, 00:25:02.365 { 00:25:02.365 "name": "BaseBdev2", 00:25:02.365 "uuid": "bdadb5dd-47d3-4c11-8bf1-1887e4e2c982", 00:25:02.365 "is_configured": true, 00:25:02.365 "data_offset": 256, 00:25:02.365 "data_size": 7936 00:25:02.365 } 00:25:02.365 ] 00:25:02.365 }' 00:25:02.365 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:02.365 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:02.936 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:25:02.936 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:25:02.936 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:02.936 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:02.936 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:02.936 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:25:02.936 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:02.936 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:03.197 [2024-06-10 10:20:24.850777] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:03.197 
10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:03.197 "name": "Existed_Raid", 00:25:03.197 "aliases": [ 00:25:03.197 "98285036-989d-4667-95fa-b706585250af" 00:25:03.197 ], 00:25:03.197 "product_name": "Raid Volume", 00:25:03.197 "block_size": 4128, 00:25:03.197 "num_blocks": 7936, 00:25:03.197 "uuid": "98285036-989d-4667-95fa-b706585250af", 00:25:03.197 "md_size": 32, 00:25:03.197 "md_interleave": true, 00:25:03.197 "dif_type": 0, 00:25:03.197 "assigned_rate_limits": { 00:25:03.197 "rw_ios_per_sec": 0, 00:25:03.197 "rw_mbytes_per_sec": 0, 00:25:03.197 "r_mbytes_per_sec": 0, 00:25:03.197 "w_mbytes_per_sec": 0 00:25:03.197 }, 00:25:03.197 "claimed": false, 00:25:03.197 "zoned": false, 00:25:03.197 "supported_io_types": { 00:25:03.197 "read": true, 00:25:03.197 "write": true, 00:25:03.197 "unmap": false, 00:25:03.197 "write_zeroes": true, 00:25:03.197 "flush": false, 00:25:03.197 "reset": true, 00:25:03.197 "compare": false, 00:25:03.197 "compare_and_write": false, 00:25:03.197 "abort": false, 00:25:03.197 "nvme_admin": false, 00:25:03.197 "nvme_io": false 00:25:03.197 }, 00:25:03.197 "memory_domains": [ 00:25:03.197 { 00:25:03.197 "dma_device_id": "system", 00:25:03.197 "dma_device_type": 1 00:25:03.197 }, 00:25:03.197 { 00:25:03.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:03.197 "dma_device_type": 2 00:25:03.197 }, 00:25:03.197 { 00:25:03.197 "dma_device_id": "system", 00:25:03.197 "dma_device_type": 1 00:25:03.197 }, 00:25:03.197 { 00:25:03.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:03.197 "dma_device_type": 2 00:25:03.197 } 00:25:03.197 ], 00:25:03.197 "driver_specific": { 00:25:03.197 "raid": { 00:25:03.197 "uuid": "98285036-989d-4667-95fa-b706585250af", 00:25:03.197 "strip_size_kb": 0, 00:25:03.197 "state": "online", 00:25:03.197 "raid_level": "raid1", 00:25:03.197 "superblock": true, 00:25:03.197 "num_base_bdevs": 2, 00:25:03.197 "num_base_bdevs_discovered": 2, 00:25:03.197 "num_base_bdevs_operational": 2, 00:25:03.197 "base_bdevs_list": [ 00:25:03.197 { 00:25:03.197 "name": "BaseBdev1", 00:25:03.197 "uuid": "7e8bfc30-4ca6-474f-a09b-f7ccd9c37fbc", 00:25:03.197 "is_configured": true, 00:25:03.197 "data_offset": 256, 00:25:03.197 "data_size": 7936 00:25:03.197 }, 00:25:03.197 { 00:25:03.197 "name": "BaseBdev2", 00:25:03.197 "uuid": "bdadb5dd-47d3-4c11-8bf1-1887e4e2c982", 00:25:03.197 "is_configured": true, 00:25:03.197 "data_offset": 256, 00:25:03.197 "data_size": 7936 00:25:03.197 } 00:25:03.197 ] 00:25:03.197 } 00:25:03.197 } 00:25:03.197 }' 00:25:03.197 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:03.197 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:25:03.197 BaseBdev2' 00:25:03.197 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:03.197 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:25:03.197 10:20:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:03.458 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:03.458 "name": "BaseBdev1", 
00:25:03.458 "aliases": [ 00:25:03.458 "7e8bfc30-4ca6-474f-a09b-f7ccd9c37fbc" 00:25:03.458 ], 00:25:03.458 "product_name": "Malloc disk", 00:25:03.458 "block_size": 4128, 00:25:03.458 "num_blocks": 8192, 00:25:03.458 "uuid": "7e8bfc30-4ca6-474f-a09b-f7ccd9c37fbc", 00:25:03.458 "md_size": 32, 00:25:03.458 "md_interleave": true, 00:25:03.458 "dif_type": 0, 00:25:03.458 "assigned_rate_limits": { 00:25:03.458 "rw_ios_per_sec": 0, 00:25:03.458 "rw_mbytes_per_sec": 0, 00:25:03.458 "r_mbytes_per_sec": 0, 00:25:03.458 "w_mbytes_per_sec": 0 00:25:03.458 }, 00:25:03.458 "claimed": true, 00:25:03.458 "claim_type": "exclusive_write", 00:25:03.458 "zoned": false, 00:25:03.458 "supported_io_types": { 00:25:03.458 "read": true, 00:25:03.458 "write": true, 00:25:03.458 "unmap": true, 00:25:03.458 "write_zeroes": true, 00:25:03.458 "flush": true, 00:25:03.458 "reset": true, 00:25:03.458 "compare": false, 00:25:03.458 "compare_and_write": false, 00:25:03.458 "abort": true, 00:25:03.458 "nvme_admin": false, 00:25:03.458 "nvme_io": false 00:25:03.458 }, 00:25:03.458 "memory_domains": [ 00:25:03.458 { 00:25:03.458 "dma_device_id": "system", 00:25:03.458 "dma_device_type": 1 00:25:03.458 }, 00:25:03.458 { 00:25:03.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:03.458 "dma_device_type": 2 00:25:03.458 } 00:25:03.458 ], 00:25:03.458 "driver_specific": {} 00:25:03.458 }' 00:25:03.458 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:03.458 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:03.458 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:03.458 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:03.458 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:03.458 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:03.458 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:03.458 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:03.720 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:03.720 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:03.720 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:03.720 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:03.720 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:03.720 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:03.720 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:03.981 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:03.981 "name": "BaseBdev2", 00:25:03.981 "aliases": [ 00:25:03.981 "bdadb5dd-47d3-4c11-8bf1-1887e4e2c982" 00:25:03.981 ], 00:25:03.981 
"product_name": "Malloc disk", 00:25:03.981 "block_size": 4128, 00:25:03.981 "num_blocks": 8192, 00:25:03.981 "uuid": "bdadb5dd-47d3-4c11-8bf1-1887e4e2c982", 00:25:03.981 "md_size": 32, 00:25:03.981 "md_interleave": true, 00:25:03.981 "dif_type": 0, 00:25:03.981 "assigned_rate_limits": { 00:25:03.981 "rw_ios_per_sec": 0, 00:25:03.981 "rw_mbytes_per_sec": 0, 00:25:03.981 "r_mbytes_per_sec": 0, 00:25:03.981 "w_mbytes_per_sec": 0 00:25:03.981 }, 00:25:03.981 "claimed": true, 00:25:03.981 "claim_type": "exclusive_write", 00:25:03.981 "zoned": false, 00:25:03.981 "supported_io_types": { 00:25:03.981 "read": true, 00:25:03.981 "write": true, 00:25:03.981 "unmap": true, 00:25:03.981 "write_zeroes": true, 00:25:03.981 "flush": true, 00:25:03.981 "reset": true, 00:25:03.981 "compare": false, 00:25:03.981 "compare_and_write": false, 00:25:03.981 "abort": true, 00:25:03.981 "nvme_admin": false, 00:25:03.981 "nvme_io": false 00:25:03.981 }, 00:25:03.981 "memory_domains": [ 00:25:03.981 { 00:25:03.981 "dma_device_id": "system", 00:25:03.981 "dma_device_type": 1 00:25:03.981 }, 00:25:03.981 { 00:25:03.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:03.981 "dma_device_type": 2 00:25:03.981 } 00:25:03.981 ], 00:25:03.981 "driver_specific": {} 00:25:03.981 }' 00:25:03.981 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:03.981 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:03.981 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:03.981 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:03.981 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:03.981 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:03.981 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:03.981 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:04.242 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:04.242 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:04.242 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:04.242 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:04.242 10:20:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:04.503 [2024-06-10 10:20:26.137883] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:25:04.503 10:20:26 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:04.503 "name": "Existed_Raid", 00:25:04.503 "uuid": "98285036-989d-4667-95fa-b706585250af", 00:25:04.503 "strip_size_kb": 0, 00:25:04.503 "state": "online", 00:25:04.503 "raid_level": "raid1", 00:25:04.503 "superblock": true, 00:25:04.503 "num_base_bdevs": 2, 00:25:04.503 "num_base_bdevs_discovered": 1, 00:25:04.503 "num_base_bdevs_operational": 1, 00:25:04.503 "base_bdevs_list": [ 00:25:04.503 { 00:25:04.503 "name": null, 00:25:04.503 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:04.503 "is_configured": false, 00:25:04.503 "data_offset": 256, 00:25:04.503 "data_size": 7936 00:25:04.503 }, 00:25:04.503 { 00:25:04.503 "name": "BaseBdev2", 00:25:04.503 "uuid": "bdadb5dd-47d3-4c11-8bf1-1887e4e2c982", 00:25:04.503 "is_configured": true, 00:25:04.503 "data_offset": 256, 00:25:04.503 "data_size": 7936 00:25:04.503 } 00:25:04.503 ] 00:25:04.503 }' 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:04.503 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:05.076 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:25:05.076 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:05.076 10:20:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.076 10:20:26 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:25:05.337 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:25:05.337 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:25:05.337 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:25:05.597 [2024-06-10 10:20:27.272763] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:05.597 [2024-06-10 10:20:27.272830] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:05.597 [2024-06-10 10:20:27.279056] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:05.597 [2024-06-10 10:20:27.279080] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:05.597 [2024-06-10 10:20:27.279085] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcbdc00 name Existed_Raid, state offline 00:25:05.597 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:25:05.597 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:05.597 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.597 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 1123454 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 1123454 ']' 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 1123454 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1123454 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1123454' 00:25:05.858 killing process with pid 1123454 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved 
-- common/autotest_common.sh@968 -- # kill 1123454 00:25:05.858 [2024-06-10 10:20:27.519843] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@973 -- # wait 1123454 00:25:05.858 [2024-06-10 10:20:27.520427] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:25:05.858 00:25:05.858 real 0m8.600s 00:25:05.858 user 0m15.549s 00:25:05.858 sys 0m1.369s 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:05.858 10:20:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:05.858 ************************************ 00:25:05.858 END TEST raid_state_function_test_sb_md_interleaved 00:25:05.858 ************************************ 00:25:05.858 10:20:27 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:25:05.858 10:20:27 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:25:05.858 10:20:27 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:05.858 10:20:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:05.858 ************************************ 00:25:05.858 START TEST raid_superblock_test_md_interleaved 00:25:05.858 ************************************ 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@411 -- # raid_pid=1124963 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 1124963 /var/tmp/spdk-raid.sock 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 1124963 ']' 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:05.858 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:05.858 10:20:27 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:06.119 [2024-06-10 10:20:27.769978] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:25:06.119 [2024-06-10 10:20:27.770028] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1124963 ] 00:25:06.119 [2024-06-10 10:20:27.859525] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:06.119 [2024-06-10 10:20:27.926206] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:06.119 [2024-06-10 10:20:27.972789] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:06.119 [2024-06-10 10:20:27.972814] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:25:07.063 malloc1 00:25:07.063 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:07.324 [2024-06-10 10:20:28.963119] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:07.324 [2024-06-10 10:20:28.963152] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:07.324 [2024-06-10 10:20:28.963163] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x137e860 00:25:07.324 [2024-06-10 10:20:28.963169] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:07.324 [2024-06-10 10:20:28.964320] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:07.324 [2024-06-10 10:20:28.964339] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:07.324 pt1 00:25:07.324 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:07.324 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:07.324 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:25:07.324 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:25:07.324 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:25:07.324 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:07.324 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:07.324 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:07.324 10:20:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:25:07.324 malloc2 00:25:07.324 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:07.587 [2024-06-10 10:20:29.318173] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:07.587 [2024-06-10 10:20:29.318200] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:07.587 [2024-06-10 10:20:29.318208] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11da830 00:25:07.587 [2024-06-10 10:20:29.318214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:07.587 [2024-06-10 10:20:29.319449] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:07.587 [2024-06-10 10:20:29.319467] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:07.587 pt2 00:25:07.587 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 
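For reference, the base-bdev setup traced above reduces to the RPC sequence below. The script path, socket, and arguments are taken verbatim from this log; the condensed form itself is only an illustrative sketch and assumes a bdev_svc app is already listening on the socket (as started earlier in this test).

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Two 32 MB malloc bdevs with 4096-byte blocks and 32 bytes of interleaved
# metadata per block (-m 32 -i); bdev_get_bdevs later reports block_size 4128
# (4096 data + 32 md) and num_blocks 8192 for the passthru bdevs on top.
$rpc -s $sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1
$rpc -s $sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2

# Passthru claims with fixed UUIDs, giving the pt1/pt2 base bdevs that the
# raid1 volume is assembled from in the next step.
$rpc -s $sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$rpc -s $sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

The next xtrace step then combines them with bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s, where the trailing -s requests the on-disk superblock ("superblock": true in the dumps below) that the rest of this test exercises.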
00:25:07.587 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:07.587 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:25:07.848 [2024-06-10 10:20:29.510666] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:07.848 [2024-06-10 10:20:29.511866] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:07.848 [2024-06-10 10:20:29.511978] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11dbf00 00:25:07.848 [2024-06-10 10:20:29.511987] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:07.848 [2024-06-10 10:20:29.512034] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11d9330 00:25:07.848 [2024-06-10 10:20:29.512097] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11dbf00 00:25:07.848 [2024-06-10 10:20:29.512103] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11dbf00 00:25:07.848 [2024-06-10 10:20:29.512142] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.848 "name": "raid_bdev1", 00:25:07.848 "uuid": "89b42c81-351c-401e-9cc3-627e4e9dd464", 00:25:07.848 "strip_size_kb": 0, 00:25:07.848 "state": "online", 00:25:07.848 "raid_level": "raid1", 00:25:07.848 "superblock": true, 00:25:07.848 "num_base_bdevs": 2, 00:25:07.848 "num_base_bdevs_discovered": 2, 00:25:07.848 "num_base_bdevs_operational": 2, 00:25:07.848 "base_bdevs_list": [ 00:25:07.848 { 00:25:07.848 "name": "pt1", 00:25:07.848 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:25:07.848 "is_configured": true, 00:25:07.848 "data_offset": 256, 00:25:07.848 "data_size": 7936 00:25:07.848 }, 00:25:07.848 { 00:25:07.848 "name": "pt2", 00:25:07.848 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:07.848 "is_configured": true, 00:25:07.848 "data_offset": 256, 00:25:07.848 "data_size": 7936 00:25:07.848 } 00:25:07.848 ] 00:25:07.848 }' 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.848 10:20:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:08.419 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:25:08.419 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:08.419 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:08.419 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:08.419 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:08.419 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:25:08.419 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:08.419 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:08.680 [2024-06-10 10:20:30.441191] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:08.680 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:08.680 "name": "raid_bdev1", 00:25:08.680 "aliases": [ 00:25:08.680 "89b42c81-351c-401e-9cc3-627e4e9dd464" 00:25:08.680 ], 00:25:08.680 "product_name": "Raid Volume", 00:25:08.680 "block_size": 4128, 00:25:08.680 "num_blocks": 7936, 00:25:08.680 "uuid": "89b42c81-351c-401e-9cc3-627e4e9dd464", 00:25:08.680 "md_size": 32, 00:25:08.680 "md_interleave": true, 00:25:08.680 "dif_type": 0, 00:25:08.680 "assigned_rate_limits": { 00:25:08.680 "rw_ios_per_sec": 0, 00:25:08.680 "rw_mbytes_per_sec": 0, 00:25:08.680 "r_mbytes_per_sec": 0, 00:25:08.680 "w_mbytes_per_sec": 0 00:25:08.680 }, 00:25:08.680 "claimed": false, 00:25:08.680 "zoned": false, 00:25:08.680 "supported_io_types": { 00:25:08.680 "read": true, 00:25:08.680 "write": true, 00:25:08.680 "unmap": false, 00:25:08.680 "write_zeroes": true, 00:25:08.680 "flush": false, 00:25:08.680 "reset": true, 00:25:08.680 "compare": false, 00:25:08.680 "compare_and_write": false, 00:25:08.680 "abort": false, 00:25:08.680 "nvme_admin": false, 00:25:08.680 "nvme_io": false 00:25:08.680 }, 00:25:08.680 "memory_domains": [ 00:25:08.680 { 00:25:08.680 "dma_device_id": "system", 00:25:08.680 "dma_device_type": 1 00:25:08.680 }, 00:25:08.680 { 00:25:08.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:08.680 "dma_device_type": 2 00:25:08.680 }, 00:25:08.680 { 00:25:08.680 "dma_device_id": "system", 00:25:08.680 "dma_device_type": 1 00:25:08.680 }, 00:25:08.680 { 00:25:08.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:08.680 "dma_device_type": 2 00:25:08.680 } 00:25:08.680 ], 00:25:08.680 "driver_specific": { 00:25:08.680 "raid": { 00:25:08.680 "uuid": 
"89b42c81-351c-401e-9cc3-627e4e9dd464", 00:25:08.680 "strip_size_kb": 0, 00:25:08.680 "state": "online", 00:25:08.680 "raid_level": "raid1", 00:25:08.680 "superblock": true, 00:25:08.680 "num_base_bdevs": 2, 00:25:08.680 "num_base_bdevs_discovered": 2, 00:25:08.680 "num_base_bdevs_operational": 2, 00:25:08.680 "base_bdevs_list": [ 00:25:08.680 { 00:25:08.680 "name": "pt1", 00:25:08.680 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:08.680 "is_configured": true, 00:25:08.680 "data_offset": 256, 00:25:08.680 "data_size": 7936 00:25:08.680 }, 00:25:08.680 { 00:25:08.680 "name": "pt2", 00:25:08.680 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:08.680 "is_configured": true, 00:25:08.681 "data_offset": 256, 00:25:08.681 "data_size": 7936 00:25:08.681 } 00:25:08.681 ] 00:25:08.681 } 00:25:08.681 } 00:25:08.681 }' 00:25:08.681 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:08.681 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:08.681 pt2' 00:25:08.681 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:08.681 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:08.681 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:08.942 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:08.942 "name": "pt1", 00:25:08.942 "aliases": [ 00:25:08.942 "00000000-0000-0000-0000-000000000001" 00:25:08.942 ], 00:25:08.942 "product_name": "passthru", 00:25:08.942 "block_size": 4128, 00:25:08.942 "num_blocks": 8192, 00:25:08.942 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:08.942 "md_size": 32, 00:25:08.942 "md_interleave": true, 00:25:08.942 "dif_type": 0, 00:25:08.942 "assigned_rate_limits": { 00:25:08.942 "rw_ios_per_sec": 0, 00:25:08.942 "rw_mbytes_per_sec": 0, 00:25:08.942 "r_mbytes_per_sec": 0, 00:25:08.942 "w_mbytes_per_sec": 0 00:25:08.942 }, 00:25:08.942 "claimed": true, 00:25:08.942 "claim_type": "exclusive_write", 00:25:08.942 "zoned": false, 00:25:08.942 "supported_io_types": { 00:25:08.942 "read": true, 00:25:08.942 "write": true, 00:25:08.942 "unmap": true, 00:25:08.942 "write_zeroes": true, 00:25:08.942 "flush": true, 00:25:08.942 "reset": true, 00:25:08.942 "compare": false, 00:25:08.942 "compare_and_write": false, 00:25:08.942 "abort": true, 00:25:08.942 "nvme_admin": false, 00:25:08.942 "nvme_io": false 00:25:08.942 }, 00:25:08.942 "memory_domains": [ 00:25:08.942 { 00:25:08.942 "dma_device_id": "system", 00:25:08.942 "dma_device_type": 1 00:25:08.942 }, 00:25:08.942 { 00:25:08.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:08.942 "dma_device_type": 2 00:25:08.942 } 00:25:08.942 ], 00:25:08.942 "driver_specific": { 00:25:08.942 "passthru": { 00:25:08.942 "name": "pt1", 00:25:08.942 "base_bdev_name": "malloc1" 00:25:08.942 } 00:25:08.942 } 00:25:08.942 }' 00:25:08.942 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:08.942 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:08.942 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:08.942 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:09.203 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:09.203 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:09.203 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:09.203 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:09.203 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:09.203 10:20:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:09.203 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:09.203 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:09.203 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:09.203 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:09.203 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:09.465 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:09.465 "name": "pt2", 00:25:09.465 "aliases": [ 00:25:09.465 "00000000-0000-0000-0000-000000000002" 00:25:09.465 ], 00:25:09.465 "product_name": "passthru", 00:25:09.465 "block_size": 4128, 00:25:09.465 "num_blocks": 8192, 00:25:09.465 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:09.465 "md_size": 32, 00:25:09.465 "md_interleave": true, 00:25:09.465 "dif_type": 0, 00:25:09.465 "assigned_rate_limits": { 00:25:09.465 "rw_ios_per_sec": 0, 00:25:09.465 "rw_mbytes_per_sec": 0, 00:25:09.465 "r_mbytes_per_sec": 0, 00:25:09.465 "w_mbytes_per_sec": 0 00:25:09.465 }, 00:25:09.465 "claimed": true, 00:25:09.465 "claim_type": "exclusive_write", 00:25:09.465 "zoned": false, 00:25:09.465 "supported_io_types": { 00:25:09.465 "read": true, 00:25:09.465 "write": true, 00:25:09.465 "unmap": true, 00:25:09.465 "write_zeroes": true, 00:25:09.465 "flush": true, 00:25:09.465 "reset": true, 00:25:09.465 "compare": false, 00:25:09.465 "compare_and_write": false, 00:25:09.465 "abort": true, 00:25:09.465 "nvme_admin": false, 00:25:09.465 "nvme_io": false 00:25:09.465 }, 00:25:09.465 "memory_domains": [ 00:25:09.465 { 00:25:09.465 "dma_device_id": "system", 00:25:09.465 "dma_device_type": 1 00:25:09.465 }, 00:25:09.465 { 00:25:09.465 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.465 "dma_device_type": 2 00:25:09.465 } 00:25:09.465 ], 00:25:09.465 "driver_specific": { 00:25:09.465 "passthru": { 00:25:09.465 "name": "pt2", 00:25:09.465 "base_bdev_name": "malloc2" 00:25:09.465 } 00:25:09.465 } 00:25:09.465 }' 00:25:09.465 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:09.465 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:09.726 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:09.726 10:20:31 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:09.726 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:09.726 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:09.726 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:09.726 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:09.726 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:09.726 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:09.726 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:09.988 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:09.988 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:25:09.988 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:09.988 [2024-06-10 10:20:31.788605] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:09.988 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=89b42c81-351c-401e-9cc3-627e4e9dd464 00:25:09.988 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 89b42c81-351c-401e-9cc3-627e4e9dd464 ']' 00:25:09.988 10:20:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:10.249 [2024-06-10 10:20:31.984927] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:10.249 [2024-06-10 10:20:31.984938] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:10.249 [2024-06-10 10:20:31.984976] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:10.249 [2024-06-10 10:20:31.985014] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:10.249 [2024-06-10 10:20:31.985020] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11dbf00 name raid_bdev1, state offline 00:25:10.249 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.249 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:25:10.509 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:25:10.509 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:25:10.509 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:10.509 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:10.768 10:20:32 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:10.768 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:10.768 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:25:10.768 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@649 -- # local es=0 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:11.028 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:11.289 [2024-06-10 10:20:32.915258] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:25:11.289 [2024-06-10 10:20:32.916321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:25:11.289 [2024-06-10 10:20:32.916364] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:25:11.289 [2024-06-10 10:20:32.916389] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on 
bdev malloc2 00:25:11.289 [2024-06-10 10:20:32.916400] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:11.289 [2024-06-10 10:20:32.916404] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11dcbd0 name raid_bdev1, state configuring 00:25:11.289 request: 00:25:11.289 { 00:25:11.289 "name": "raid_bdev1", 00:25:11.289 "raid_level": "raid1", 00:25:11.289 "base_bdevs": [ 00:25:11.289 "malloc1", 00:25:11.289 "malloc2" 00:25:11.289 ], 00:25:11.289 "superblock": false, 00:25:11.289 "method": "bdev_raid_create", 00:25:11.289 "req_id": 1 00:25:11.289 } 00:25:11.289 Got JSON-RPC error response 00:25:11.289 response: 00:25:11.289 { 00:25:11.289 "code": -17, 00:25:11.289 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:25:11.289 } 00:25:11.289 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # es=1 00:25:11.289 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:25:11.289 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:25:11.289 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:25:11.289 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.289 10:20:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:25:11.289 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:25:11.289 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:25:11.289 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:11.549 [2024-06-10 10:20:33.300187] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:11.549 [2024-06-10 10:20:33.300210] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:11.549 [2024-06-10 10:20:33.300219] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11dc790 00:25:11.549 [2024-06-10 10:20:33.300225] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:11.549 [2024-06-10 10:20:33.301315] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:11.549 [2024-06-10 10:20:33.301333] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:11.549 [2024-06-10 10:20:33.301360] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:11.549 [2024-06-10 10:20:33.301379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:11.549 pt1 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.549 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:11.810 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:11.810 "name": "raid_bdev1", 00:25:11.810 "uuid": "89b42c81-351c-401e-9cc3-627e4e9dd464", 00:25:11.810 "strip_size_kb": 0, 00:25:11.810 "state": "configuring", 00:25:11.810 "raid_level": "raid1", 00:25:11.810 "superblock": true, 00:25:11.810 "num_base_bdevs": 2, 00:25:11.810 "num_base_bdevs_discovered": 1, 00:25:11.810 "num_base_bdevs_operational": 2, 00:25:11.810 "base_bdevs_list": [ 00:25:11.810 { 00:25:11.810 "name": "pt1", 00:25:11.810 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:11.810 "is_configured": true, 00:25:11.810 "data_offset": 256, 00:25:11.810 "data_size": 7936 00:25:11.810 }, 00:25:11.810 { 00:25:11.810 "name": null, 00:25:11.810 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:11.810 "is_configured": false, 00:25:11.810 "data_offset": 256, 00:25:11.810 "data_size": 7936 00:25:11.810 } 00:25:11.810 ] 00:25:11.810 }' 00:25:11.810 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:11.810 10:20:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:12.381 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:25:12.381 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:25:12.381 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:12.381 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:12.381 [2024-06-10 10:20:34.234566] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:12.381 [2024-06-10 10:20:34.234597] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:12.381 [2024-06-10 10:20:34.234608] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11dd900 00:25:12.381 [2024-06-10 10:20:34.234614] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:12.381 [2024-06-10 10:20:34.234732] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
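The verify_raid_bdev_state calls interleaved through this trace are, at their core, a bdev_raid_get_bdevs query filtered with jq and compared against the expected fields. A minimal standalone sketch of that check follows; the socket and JSON field names come from the dumps in this log, while the check_raid_state function name is illustrative and not part of bdev_raid.sh itself.

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

check_raid_state() {
    # Usage: check_raid_state <raid_bdev_name> <expected_state> <expected_discovered>
    local name=$1 want_state=$2 want_discovered=$3
    local info
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
    [[ $(jq -r .state <<< "$info") == "$want_state" ]] &&
        [[ $(jq -r .num_base_bdevs_discovered <<< "$info") -eq $want_discovered ]]
}

# With only pt1 claimed (the point reached just above) the volume is still
# "configuring" with one base bdev discovered; once pt2 is claimed from its
# superblock in the following step it transitions to "online" with two.
check_raid_state raid_bdev1 configuring 1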
00:25:12.381 [2024-06-10 10:20:34.234742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:12.381 [2024-06-10 10:20:34.234769] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:12.381 [2024-06-10 10:20:34.234780] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:12.381 [2024-06-10 10:20:34.234849] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11de1e0 00:25:12.381 [2024-06-10 10:20:34.234860] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:12.381 [2024-06-10 10:20:34.234902] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e2070 00:25:12.381 [2024-06-10 10:20:34.234960] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11de1e0 00:25:12.381 [2024-06-10 10:20:34.234965] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11de1e0 00:25:12.381 [2024-06-10 10:20:34.235007] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:12.381 pt2 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:12.642 "name": "raid_bdev1", 00:25:12.642 "uuid": "89b42c81-351c-401e-9cc3-627e4e9dd464", 00:25:12.642 "strip_size_kb": 0, 00:25:12.642 "state": "online", 00:25:12.642 "raid_level": "raid1", 00:25:12.642 "superblock": true, 00:25:12.642 "num_base_bdevs": 2, 00:25:12.642 "num_base_bdevs_discovered": 2, 00:25:12.642 "num_base_bdevs_operational": 2, 00:25:12.642 "base_bdevs_list": [ 00:25:12.642 { 00:25:12.642 "name": "pt1", 00:25:12.642 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:25:12.642 "is_configured": true, 00:25:12.642 "data_offset": 256, 00:25:12.642 "data_size": 7936 00:25:12.642 }, 00:25:12.642 { 00:25:12.642 "name": "pt2", 00:25:12.642 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:12.642 "is_configured": true, 00:25:12.642 "data_offset": 256, 00:25:12.642 "data_size": 7936 00:25:12.642 } 00:25:12.642 ] 00:25:12.642 }' 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:12.642 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:13.214 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:25:13.214 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:13.214 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:13.214 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:13.214 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:13.214 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:25:13.214 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:13.214 10:20:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:13.475 [2024-06-10 10:20:35.141033] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:13.475 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:13.475 "name": "raid_bdev1", 00:25:13.475 "aliases": [ 00:25:13.475 "89b42c81-351c-401e-9cc3-627e4e9dd464" 00:25:13.475 ], 00:25:13.475 "product_name": "Raid Volume", 00:25:13.475 "block_size": 4128, 00:25:13.475 "num_blocks": 7936, 00:25:13.475 "uuid": "89b42c81-351c-401e-9cc3-627e4e9dd464", 00:25:13.475 "md_size": 32, 00:25:13.475 "md_interleave": true, 00:25:13.475 "dif_type": 0, 00:25:13.475 "assigned_rate_limits": { 00:25:13.475 "rw_ios_per_sec": 0, 00:25:13.475 "rw_mbytes_per_sec": 0, 00:25:13.475 "r_mbytes_per_sec": 0, 00:25:13.475 "w_mbytes_per_sec": 0 00:25:13.475 }, 00:25:13.475 "claimed": false, 00:25:13.475 "zoned": false, 00:25:13.475 "supported_io_types": { 00:25:13.475 "read": true, 00:25:13.475 "write": true, 00:25:13.475 "unmap": false, 00:25:13.475 "write_zeroes": true, 00:25:13.475 "flush": false, 00:25:13.475 "reset": true, 00:25:13.475 "compare": false, 00:25:13.475 "compare_and_write": false, 00:25:13.475 "abort": false, 00:25:13.475 "nvme_admin": false, 00:25:13.475 "nvme_io": false 00:25:13.475 }, 00:25:13.475 "memory_domains": [ 00:25:13.475 { 00:25:13.475 "dma_device_id": "system", 00:25:13.475 "dma_device_type": 1 00:25:13.475 }, 00:25:13.475 { 00:25:13.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:13.475 "dma_device_type": 2 00:25:13.475 }, 00:25:13.475 { 00:25:13.475 "dma_device_id": "system", 00:25:13.475 "dma_device_type": 1 00:25:13.475 }, 00:25:13.476 { 00:25:13.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:13.476 "dma_device_type": 2 00:25:13.476 } 00:25:13.476 ], 00:25:13.476 "driver_specific": { 00:25:13.476 "raid": { 00:25:13.476 "uuid": 
"89b42c81-351c-401e-9cc3-627e4e9dd464", 00:25:13.476 "strip_size_kb": 0, 00:25:13.476 "state": "online", 00:25:13.476 "raid_level": "raid1", 00:25:13.476 "superblock": true, 00:25:13.476 "num_base_bdevs": 2, 00:25:13.476 "num_base_bdevs_discovered": 2, 00:25:13.476 "num_base_bdevs_operational": 2, 00:25:13.476 "base_bdevs_list": [ 00:25:13.476 { 00:25:13.476 "name": "pt1", 00:25:13.476 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:13.476 "is_configured": true, 00:25:13.476 "data_offset": 256, 00:25:13.476 "data_size": 7936 00:25:13.476 }, 00:25:13.476 { 00:25:13.476 "name": "pt2", 00:25:13.476 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:13.476 "is_configured": true, 00:25:13.476 "data_offset": 256, 00:25:13.476 "data_size": 7936 00:25:13.476 } 00:25:13.476 ] 00:25:13.476 } 00:25:13.476 } 00:25:13.476 }' 00:25:13.476 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:13.476 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:13.476 pt2' 00:25:13.476 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:13.476 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:13.476 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:13.736 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:13.736 "name": "pt1", 00:25:13.736 "aliases": [ 00:25:13.736 "00000000-0000-0000-0000-000000000001" 00:25:13.736 ], 00:25:13.736 "product_name": "passthru", 00:25:13.736 "block_size": 4128, 00:25:13.736 "num_blocks": 8192, 00:25:13.736 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:13.736 "md_size": 32, 00:25:13.736 "md_interleave": true, 00:25:13.736 "dif_type": 0, 00:25:13.736 "assigned_rate_limits": { 00:25:13.736 "rw_ios_per_sec": 0, 00:25:13.736 "rw_mbytes_per_sec": 0, 00:25:13.736 "r_mbytes_per_sec": 0, 00:25:13.736 "w_mbytes_per_sec": 0 00:25:13.736 }, 00:25:13.736 "claimed": true, 00:25:13.736 "claim_type": "exclusive_write", 00:25:13.736 "zoned": false, 00:25:13.736 "supported_io_types": { 00:25:13.736 "read": true, 00:25:13.736 "write": true, 00:25:13.736 "unmap": true, 00:25:13.736 "write_zeroes": true, 00:25:13.736 "flush": true, 00:25:13.736 "reset": true, 00:25:13.736 "compare": false, 00:25:13.736 "compare_and_write": false, 00:25:13.736 "abort": true, 00:25:13.736 "nvme_admin": false, 00:25:13.736 "nvme_io": false 00:25:13.736 }, 00:25:13.736 "memory_domains": [ 00:25:13.736 { 00:25:13.736 "dma_device_id": "system", 00:25:13.736 "dma_device_type": 1 00:25:13.736 }, 00:25:13.736 { 00:25:13.736 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:13.736 "dma_device_type": 2 00:25:13.736 } 00:25:13.736 ], 00:25:13.736 "driver_specific": { 00:25:13.736 "passthru": { 00:25:13.736 "name": "pt1", 00:25:13.736 "base_bdev_name": "malloc1" 00:25:13.736 } 00:25:13.736 } 00:25:13.736 }' 00:25:13.736 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:13.736 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:13.736 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:13.736 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:13.736 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:13.737 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:13.737 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:13.997 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:13.997 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:13.997 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:13.997 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:13.997 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:13.997 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:13.997 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:13.997 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:14.258 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:14.258 "name": "pt2", 00:25:14.258 "aliases": [ 00:25:14.258 "00000000-0000-0000-0000-000000000002" 00:25:14.258 ], 00:25:14.258 "product_name": "passthru", 00:25:14.258 "block_size": 4128, 00:25:14.258 "num_blocks": 8192, 00:25:14.258 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:14.258 "md_size": 32, 00:25:14.258 "md_interleave": true, 00:25:14.258 "dif_type": 0, 00:25:14.258 "assigned_rate_limits": { 00:25:14.258 "rw_ios_per_sec": 0, 00:25:14.258 "rw_mbytes_per_sec": 0, 00:25:14.259 "r_mbytes_per_sec": 0, 00:25:14.259 "w_mbytes_per_sec": 0 00:25:14.259 }, 00:25:14.259 "claimed": true, 00:25:14.259 "claim_type": "exclusive_write", 00:25:14.259 "zoned": false, 00:25:14.259 "supported_io_types": { 00:25:14.259 "read": true, 00:25:14.259 "write": true, 00:25:14.259 "unmap": true, 00:25:14.259 "write_zeroes": true, 00:25:14.259 "flush": true, 00:25:14.259 "reset": true, 00:25:14.259 "compare": false, 00:25:14.259 "compare_and_write": false, 00:25:14.259 "abort": true, 00:25:14.259 "nvme_admin": false, 00:25:14.259 "nvme_io": false 00:25:14.259 }, 00:25:14.259 "memory_domains": [ 00:25:14.259 { 00:25:14.259 "dma_device_id": "system", 00:25:14.259 "dma_device_type": 1 00:25:14.259 }, 00:25:14.259 { 00:25:14.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:14.259 "dma_device_type": 2 00:25:14.259 } 00:25:14.259 ], 00:25:14.259 "driver_specific": { 00:25:14.259 "passthru": { 00:25:14.259 "name": "pt2", 00:25:14.259 "base_bdev_name": "malloc2" 00:25:14.259 } 00:25:14.259 } 00:25:14.259 }' 00:25:14.259 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:14.259 10:20:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:14.259 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:14.259 10:20:36 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:14.259 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:14.259 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:14.259 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:14.519 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:14.519 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:14.519 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:14.519 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:14.519 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:14.519 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:14.519 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:25:14.780 [2024-06-10 10:20:36.464357] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:14.780 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 89b42c81-351c-401e-9cc3-627e4e9dd464 '!=' 89b42c81-351c-401e-9cc3-627e4e9dd464 ']' 00:25:14.780 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:25:14.780 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:14.780 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:25:14.780 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:15.040 [2024-06-10 10:20:36.656686] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:25:15.040 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:15.040 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:15.041 
10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:15.041 "name": "raid_bdev1", 00:25:15.041 "uuid": "89b42c81-351c-401e-9cc3-627e4e9dd464", 00:25:15.041 "strip_size_kb": 0, 00:25:15.041 "state": "online", 00:25:15.041 "raid_level": "raid1", 00:25:15.041 "superblock": true, 00:25:15.041 "num_base_bdevs": 2, 00:25:15.041 "num_base_bdevs_discovered": 1, 00:25:15.041 "num_base_bdevs_operational": 1, 00:25:15.041 "base_bdevs_list": [ 00:25:15.041 { 00:25:15.041 "name": null, 00:25:15.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.041 "is_configured": false, 00:25:15.041 "data_offset": 256, 00:25:15.041 "data_size": 7936 00:25:15.041 }, 00:25:15.041 { 00:25:15.041 "name": "pt2", 00:25:15.041 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:15.041 "is_configured": true, 00:25:15.041 "data_offset": 256, 00:25:15.041 "data_size": 7936 00:25:15.041 } 00:25:15.041 ] 00:25:15.041 }' 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:15.041 10:20:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:15.612 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:15.871 [2024-06-10 10:20:37.554934] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:15.871 [2024-06-10 10:20:37.554950] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:15.871 [2024-06-10 10:20:37.554984] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:15.871 [2024-06-10 10:20:37.555015] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:15.871 [2024-06-10 10:20:37.555021] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11de1e0 name raid_bdev1, state offline 00:25:15.871 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:25:15.871 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.131 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:25:16.131 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:25:16.131 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:25:16.131 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:25:16.131 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:16.131 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:25:16.131 10:20:37 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:25:16.131 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:25:16.131 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:25:16.131 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:25:16.131 10:20:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:16.392 [2024-06-10 10:20:38.112328] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:16.392 [2024-06-10 10:20:38.112356] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:16.392 [2024-06-10 10:20:38.112364] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11daf60 00:25:16.392 [2024-06-10 10:20:38.112371] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:16.392 [2024-06-10 10:20:38.113498] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:16.392 [2024-06-10 10:20:38.113516] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:16.392 [2024-06-10 10:20:38.113548] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:16.392 [2024-06-10 10:20:38.113564] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:16.392 [2024-06-10 10:20:38.113615] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11debe0 00:25:16.392 [2024-06-10 10:20:38.113621] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:16.392 [2024-06-10 10:20:38.113662] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11db1f0 00:25:16.392 [2024-06-10 10:20:38.113716] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11debe0 00:25:16.392 [2024-06-10 10:20:38.113721] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11debe0 00:25:16.392 [2024-06-10 10:20:38.113760] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:16.392 pt2 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.392 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.653 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:16.653 "name": "raid_bdev1", 00:25:16.653 "uuid": "89b42c81-351c-401e-9cc3-627e4e9dd464", 00:25:16.653 "strip_size_kb": 0, 00:25:16.653 "state": "online", 00:25:16.653 "raid_level": "raid1", 00:25:16.653 "superblock": true, 00:25:16.653 "num_base_bdevs": 2, 00:25:16.653 "num_base_bdevs_discovered": 1, 00:25:16.653 "num_base_bdevs_operational": 1, 00:25:16.653 "base_bdevs_list": [ 00:25:16.653 { 00:25:16.653 "name": null, 00:25:16.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:16.653 "is_configured": false, 00:25:16.653 "data_offset": 256, 00:25:16.653 "data_size": 7936 00:25:16.653 }, 00:25:16.653 { 00:25:16.653 "name": "pt2", 00:25:16.653 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:16.653 "is_configured": true, 00:25:16.653 "data_offset": 256, 00:25:16.653 "data_size": 7936 00:25:16.653 } 00:25:16.653 ] 00:25:16.653 }' 00:25:16.653 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:16.653 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:17.224 10:20:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:17.224 [2024-06-10 10:20:39.046683] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:17.224 [2024-06-10 10:20:39.046697] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:17.224 [2024-06-10 10:20:39.046732] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:17.224 [2024-06-10 10:20:39.046761] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:17.224 [2024-06-10 10:20:39.046766] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11debe0 name raid_bdev1, state offline 00:25:17.224 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.225 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:25:17.485 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:25:17.485 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:25:17.485 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:25:17.485 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:17.746 [2024-06-10 10:20:39.411591] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:17.746 [2024-06-10 10:20:39.411617] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:17.746 [2024-06-10 10:20:39.411626] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11ddb30 00:25:17.746 [2024-06-10 10:20:39.411632] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:17.746 [2024-06-10 10:20:39.412757] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:17.746 [2024-06-10 10:20:39.412775] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:17.746 [2024-06-10 10:20:39.412806] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:17.746 [2024-06-10 10:20:39.412828] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:17.746 [2024-06-10 10:20:39.412888] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:25:17.746 [2024-06-10 10:20:39.412899] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:17.746 [2024-06-10 10:20:39.412907] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11e1b00 name raid_bdev1, state configuring 00:25:17.746 [2024-06-10 10:20:39.412922] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:17.746 [2024-06-10 10:20:39.412956] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11e2bf0 00:25:17.746 [2024-06-10 10:20:39.412961] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:17.746 [2024-06-10 10:20:39.412999] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e0480 00:25:17.746 [2024-06-10 10:20:39.413052] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11e2bf0 00:25:17.746 [2024-06-10 10:20:39.413057] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11e2bf0 00:25:17.746 [2024-06-10 10:20:39.413101] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:17.746 pt1 00:25:17.746 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:25:17.746 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:17.746 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:17.746 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:17.746 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:17.746 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:17.747 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:17.747 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:17.747 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:17.747 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:17.747 10:20:39 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:17.747 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.747 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:18.007 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:18.007 "name": "raid_bdev1", 00:25:18.007 "uuid": "89b42c81-351c-401e-9cc3-627e4e9dd464", 00:25:18.007 "strip_size_kb": 0, 00:25:18.007 "state": "online", 00:25:18.007 "raid_level": "raid1", 00:25:18.007 "superblock": true, 00:25:18.007 "num_base_bdevs": 2, 00:25:18.007 "num_base_bdevs_discovered": 1, 00:25:18.007 "num_base_bdevs_operational": 1, 00:25:18.007 "base_bdevs_list": [ 00:25:18.007 { 00:25:18.007 "name": null, 00:25:18.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:18.007 "is_configured": false, 00:25:18.007 "data_offset": 256, 00:25:18.007 "data_size": 7936 00:25:18.007 }, 00:25:18.007 { 00:25:18.007 "name": "pt2", 00:25:18.007 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:18.007 "is_configured": true, 00:25:18.007 "data_offset": 256, 00:25:18.007 "data_size": 7936 00:25:18.007 } 00:25:18.007 ] 00:25:18.007 }' 00:25:18.007 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:18.007 10:20:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:18.577 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:25:18.577 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:25:18.577 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:25:18.577 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:18.577 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:25:18.857 [2024-06-10 10:20:40.522575] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:18.857 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 89b42c81-351c-401e-9cc3-627e4e9dd464 '!=' 89b42c81-351c-401e-9cc3-627e4e9dd464 ']' 00:25:18.857 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 1124963 00:25:18.857 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 1124963 ']' 00:25:18.857 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 1124963 00:25:18.857 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:25:18.857 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:18.857 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1124963 00:25:18.857 10:20:40 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:18.857 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:18.857 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1124963' 00:25:18.857 killing process with pid 1124963 00:25:18.857 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@968 -- # kill 1124963 00:25:18.857 [2024-06-10 10:20:40.592394] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:18.857 [2024-06-10 10:20:40.592439] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:18.857 [2024-06-10 10:20:40.592476] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:18.857 [2024-06-10 10:20:40.592483] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11e2bf0 name raid_bdev1, state offline 00:25:18.857 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@973 -- # wait 1124963 00:25:18.857 [2024-06-10 10:20:40.607402] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:19.128 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:25:19.128 00:25:19.128 real 0m13.087s 00:25:19.128 user 0m24.183s 00:25:19.129 sys 0m1.974s 00:25:19.129 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:19.129 10:20:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:19.129 ************************************ 00:25:19.129 END TEST raid_superblock_test_md_interleaved 00:25:19.129 ************************************ 00:25:19.129 10:20:40 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:25:19.129 10:20:40 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:25:19.129 10:20:40 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:19.129 10:20:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:19.129 ************************************ 00:25:19.129 START TEST raid_rebuild_test_sb_md_interleaved 00:25:19.129 ************************************ 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false false 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 
00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=1127429 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 1127429 /var/tmp/spdk-raid.sock 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 1127429 ']' 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:19.129 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:19.129 10:20:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:19.129 [2024-06-10 10:20:40.952536] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:25:19.129 [2024-06-10 10:20:40.952597] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1127429 ] 00:25:19.129 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:19.129 Zero copy mechanism will not be used. 00:25:19.390 [2024-06-10 10:20:41.045456] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:19.390 [2024-06-10 10:20:41.113558] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:19.390 [2024-06-10 10:20:41.169118] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:19.390 [2024-06-10 10:20:41.169145] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:19.961 10:20:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:19.961 10:20:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:25:19.961 10:20:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:19.961 10:20:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:25:20.222 BaseBdev1_malloc 00:25:20.222 10:20:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:20.481 [2024-06-10 10:20:42.147413] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:20.481 [2024-06-10 10:20:42.147451] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:20.481 [2024-06-10 10:20:42.147464] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x174c920 00:25:20.481 [2024-06-10 10:20:42.147470] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:20.481 [2024-06-10 10:20:42.148613] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:20.481 [2024-06-10 10:20:42.148632] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:20.481 BaseBdev1 00:25:20.481 10:20:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:20.481 10:20:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:25:20.481 BaseBdev2_malloc 00:25:20.741 10:20:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:20.741 [2024-06-10 10:20:42.522227] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:20.741 [2024-06-10 10:20:42.522254] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:20.741 [2024-06-10 10:20:42.522266] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1744140 00:25:20.741 [2024-06-10 10:20:42.522272] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:20.741 [2024-06-10 10:20:42.523498] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:20.741 [2024-06-10 10:20:42.523516] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:20.741 BaseBdev2 00:25:20.741 10:20:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:25:21.001 spare_malloc 00:25:21.001 10:20:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:21.262 spare_delay 00:25:21.262 10:20:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:21.262 [2024-06-10 10:20:43.065549] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:21.262 [2024-06-10 10:20:43.065580] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:21.262 [2024-06-10 10:20:43.065592] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17471f0 00:25:21.262 [2024-06-10 10:20:43.065598] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:21.262 [2024-06-10 10:20:43.066635] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:21.262 [2024-06-10 10:20:43.066652] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:21.262 spare 00:25:21.262 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:21.522 [2024-06-10 10:20:43.254048] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:21.522 [2024-06-10 10:20:43.255025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:21.522 [2024-06-10 10:20:43.255147] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17478d0 00:25:21.522 [2024-06-10 10:20:43.255156] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:21.522 [2024-06-10 10:20:43.255200] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15af600 00:25:21.522 [2024-06-10 10:20:43.255262] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17478d0 00:25:21.522 [2024-06-10 10:20:43.255267] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17478d0 00:25:21.522 [2024-06-10 10:20:43.255305] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 
00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.522 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:21.784 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:21.784 "name": "raid_bdev1", 00:25:21.784 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:21.784 "strip_size_kb": 0, 00:25:21.784 "state": "online", 00:25:21.784 "raid_level": "raid1", 00:25:21.784 "superblock": true, 00:25:21.784 "num_base_bdevs": 2, 00:25:21.784 "num_base_bdevs_discovered": 2, 00:25:21.784 "num_base_bdevs_operational": 2, 00:25:21.784 "base_bdevs_list": [ 00:25:21.784 { 00:25:21.784 "name": "BaseBdev1", 00:25:21.784 "uuid": "3b04a1ce-f7e3-53e1-b2ea-9d91cffcc9b2", 00:25:21.784 "is_configured": true, 00:25:21.784 "data_offset": 256, 00:25:21.784 "data_size": 7936 00:25:21.784 }, 00:25:21.784 { 00:25:21.784 "name": "BaseBdev2", 00:25:21.784 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:21.784 "is_configured": true, 00:25:21.784 "data_offset": 256, 00:25:21.784 "data_size": 7936 00:25:21.784 } 00:25:21.784 ] 00:25:21.784 }' 00:25:21.784 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:21.784 10:20:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:22.355 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:22.355 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:22.355 [2024-06-10 10:20:44.188569] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:22.355 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:25:22.355 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:22.355 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.616 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:25:22.616 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # 
'[' false = true ']' 00:25:22.616 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:25:22.616 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:22.877 [2024-06-10 10:20:44.561325] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.877 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.138 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:23.138 "name": "raid_bdev1", 00:25:23.138 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:23.138 "strip_size_kb": 0, 00:25:23.138 "state": "online", 00:25:23.138 "raid_level": "raid1", 00:25:23.138 "superblock": true, 00:25:23.138 "num_base_bdevs": 2, 00:25:23.138 "num_base_bdevs_discovered": 1, 00:25:23.138 "num_base_bdevs_operational": 1, 00:25:23.138 "base_bdevs_list": [ 00:25:23.138 { 00:25:23.138 "name": null, 00:25:23.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.138 "is_configured": false, 00:25:23.138 "data_offset": 256, 00:25:23.138 "data_size": 7936 00:25:23.138 }, 00:25:23.138 { 00:25:23.138 "name": "BaseBdev2", 00:25:23.138 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:23.138 "is_configured": true, 00:25:23.138 "data_offset": 256, 00:25:23.138 "data_size": 7936 00:25:23.138 } 00:25:23.138 ] 00:25:23.138 }' 00:25:23.138 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:23.138 10:20:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:23.710 10:20:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:23.710 [2024-06-10 10:20:45.447587] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:23.710 [2024-06-10 10:20:45.450078] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1749290 00:25:23.710 [2024-06-10 10:20:45.451458] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:23.710 10:20:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:24.652 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:24.652 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:24.652 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:24.652 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:24.652 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:24.652 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.652 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.912 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:24.912 "name": "raid_bdev1", 00:25:24.912 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:24.912 "strip_size_kb": 0, 00:25:24.913 "state": "online", 00:25:24.913 "raid_level": "raid1", 00:25:24.913 "superblock": true, 00:25:24.913 "num_base_bdevs": 2, 00:25:24.913 "num_base_bdevs_discovered": 2, 00:25:24.913 "num_base_bdevs_operational": 2, 00:25:24.913 "process": { 00:25:24.913 "type": "rebuild", 00:25:24.913 "target": "spare", 00:25:24.913 "progress": { 00:25:24.913 "blocks": 2816, 00:25:24.913 "percent": 35 00:25:24.913 } 00:25:24.913 }, 00:25:24.913 "base_bdevs_list": [ 00:25:24.913 { 00:25:24.913 "name": "spare", 00:25:24.913 "uuid": "769f883e-4777-50e4-82fb-1c07291227fb", 00:25:24.913 "is_configured": true, 00:25:24.913 "data_offset": 256, 00:25:24.913 "data_size": 7936 00:25:24.913 }, 00:25:24.913 { 00:25:24.913 "name": "BaseBdev2", 00:25:24.913 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:24.913 "is_configured": true, 00:25:24.913 "data_offset": 256, 00:25:24.913 "data_size": 7936 00:25:24.913 } 00:25:24.913 ] 00:25:24.913 }' 00:25:24.913 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:24.913 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:24.913 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:24.913 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:24.913 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:25.174 [2024-06-10 10:20:46.924277] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:25.174 [2024-06-10 10:20:46.960356] bdev_raid.c:2513:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:25.174 [2024-06-10 10:20:46.960389] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:25.174 [2024-06-10 10:20:46.960399] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:25.174 [2024-06-10 10:20:46.960403] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.174 10:20:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.436 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:25.436 "name": "raid_bdev1", 00:25:25.436 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:25.436 "strip_size_kb": 0, 00:25:25.436 "state": "online", 00:25:25.436 "raid_level": "raid1", 00:25:25.436 "superblock": true, 00:25:25.436 "num_base_bdevs": 2, 00:25:25.436 "num_base_bdevs_discovered": 1, 00:25:25.436 "num_base_bdevs_operational": 1, 00:25:25.436 "base_bdevs_list": [ 00:25:25.436 { 00:25:25.436 "name": null, 00:25:25.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:25.436 "is_configured": false, 00:25:25.436 "data_offset": 256, 00:25:25.436 "data_size": 7936 00:25:25.436 }, 00:25:25.436 { 00:25:25.436 "name": "BaseBdev2", 00:25:25.436 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:25.436 "is_configured": true, 00:25:25.436 "data_offset": 256, 00:25:25.436 "data_size": 7936 00:25:25.436 } 00:25:25.436 ] 00:25:25.436 }' 00:25:25.436 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:25.436 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:26.007 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:26.007 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:26.007 10:20:47 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:26.007 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:26.007 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:26.007 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.007 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.268 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:26.268 "name": "raid_bdev1", 00:25:26.268 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:26.268 "strip_size_kb": 0, 00:25:26.268 "state": "online", 00:25:26.268 "raid_level": "raid1", 00:25:26.268 "superblock": true, 00:25:26.268 "num_base_bdevs": 2, 00:25:26.268 "num_base_bdevs_discovered": 1, 00:25:26.268 "num_base_bdevs_operational": 1, 00:25:26.268 "base_bdevs_list": [ 00:25:26.268 { 00:25:26.268 "name": null, 00:25:26.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.268 "is_configured": false, 00:25:26.268 "data_offset": 256, 00:25:26.268 "data_size": 7936 00:25:26.268 }, 00:25:26.268 { 00:25:26.268 "name": "BaseBdev2", 00:25:26.268 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:26.268 "is_configured": true, 00:25:26.268 "data_offset": 256, 00:25:26.268 "data_size": 7936 00:25:26.268 } 00:25:26.268 ] 00:25:26.268 }' 00:25:26.268 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:26.268 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:26.268 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:26.268 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:26.268 10:20:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:26.529 [2024-06-10 10:20:48.155411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:26.529 [2024-06-10 10:20:48.157920] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1747cb0 00:25:26.529 [2024-06-10 10:20:48.159091] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:26.529 10:20:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:27.471 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:27.471 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:27.471 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:27.471 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:27.471 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:27.471 10:20:49 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.471 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:27.732 "name": "raid_bdev1", 00:25:27.732 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:27.732 "strip_size_kb": 0, 00:25:27.732 "state": "online", 00:25:27.732 "raid_level": "raid1", 00:25:27.732 "superblock": true, 00:25:27.732 "num_base_bdevs": 2, 00:25:27.732 "num_base_bdevs_discovered": 2, 00:25:27.732 "num_base_bdevs_operational": 2, 00:25:27.732 "process": { 00:25:27.732 "type": "rebuild", 00:25:27.732 "target": "spare", 00:25:27.732 "progress": { 00:25:27.732 "blocks": 2816, 00:25:27.732 "percent": 35 00:25:27.732 } 00:25:27.732 }, 00:25:27.732 "base_bdevs_list": [ 00:25:27.732 { 00:25:27.732 "name": "spare", 00:25:27.732 "uuid": "769f883e-4777-50e4-82fb-1c07291227fb", 00:25:27.732 "is_configured": true, 00:25:27.732 "data_offset": 256, 00:25:27.732 "data_size": 7936 00:25:27.732 }, 00:25:27.732 { 00:25:27.732 "name": "BaseBdev2", 00:25:27.732 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:27.732 "is_configured": true, 00:25:27.732 "data_offset": 256, 00:25:27.732 "data_size": 7936 00:25:27.732 } 00:25:27.732 ] 00:25:27.732 }' 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:27.732 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=941 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:27.732 10:20:49 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.732 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.994 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:27.994 "name": "raid_bdev1", 00:25:27.994 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:27.994 "strip_size_kb": 0, 00:25:27.994 "state": "online", 00:25:27.994 "raid_level": "raid1", 00:25:27.994 "superblock": true, 00:25:27.994 "num_base_bdevs": 2, 00:25:27.994 "num_base_bdevs_discovered": 2, 00:25:27.994 "num_base_bdevs_operational": 2, 00:25:27.994 "process": { 00:25:27.994 "type": "rebuild", 00:25:27.994 "target": "spare", 00:25:27.994 "progress": { 00:25:27.994 "blocks": 3584, 00:25:27.994 "percent": 45 00:25:27.994 } 00:25:27.994 }, 00:25:27.994 "base_bdevs_list": [ 00:25:27.994 { 00:25:27.994 "name": "spare", 00:25:27.994 "uuid": "769f883e-4777-50e4-82fb-1c07291227fb", 00:25:27.994 "is_configured": true, 00:25:27.994 "data_offset": 256, 00:25:27.994 "data_size": 7936 00:25:27.994 }, 00:25:27.994 { 00:25:27.994 "name": "BaseBdev2", 00:25:27.994 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:27.994 "is_configured": true, 00:25:27.994 "data_offset": 256, 00:25:27.994 "data_size": 7936 00:25:27.994 } 00:25:27.994 ] 00:25:27.994 }' 00:25:27.994 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:27.994 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:27.994 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:27.994 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:27.994 10:20:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:28.936 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:28.936 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:28.936 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:28.936 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:28.936 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:28.936 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:28.936 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.936 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.198 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.198 "name": "raid_bdev1", 00:25:29.198 "uuid": 
"0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:29.198 "strip_size_kb": 0, 00:25:29.198 "state": "online", 00:25:29.198 "raid_level": "raid1", 00:25:29.198 "superblock": true, 00:25:29.198 "num_base_bdevs": 2, 00:25:29.198 "num_base_bdevs_discovered": 2, 00:25:29.198 "num_base_bdevs_operational": 2, 00:25:29.198 "process": { 00:25:29.198 "type": "rebuild", 00:25:29.198 "target": "spare", 00:25:29.198 "progress": { 00:25:29.198 "blocks": 6912, 00:25:29.198 "percent": 87 00:25:29.198 } 00:25:29.198 }, 00:25:29.198 "base_bdevs_list": [ 00:25:29.198 { 00:25:29.198 "name": "spare", 00:25:29.198 "uuid": "769f883e-4777-50e4-82fb-1c07291227fb", 00:25:29.198 "is_configured": true, 00:25:29.198 "data_offset": 256, 00:25:29.198 "data_size": 7936 00:25:29.198 }, 00:25:29.198 { 00:25:29.198 "name": "BaseBdev2", 00:25:29.198 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:29.198 "is_configured": true, 00:25:29.198 "data_offset": 256, 00:25:29.198 "data_size": 7936 00:25:29.198 } 00:25:29.198 ] 00:25:29.198 }' 00:25:29.198 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.198 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:29.198 10:20:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.198 10:20:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:29.198 10:20:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:29.459 [2024-06-10 10:20:51.277265] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:29.459 [2024-06-10 10:20:51.277311] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:29.459 [2024-06-10 10:20:51.277373] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:30.400 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:30.400 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:30.400 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:30.400 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:30.400 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:30.400 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:30.400 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.400 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.400 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:30.400 "name": "raid_bdev1", 00:25:30.400 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:30.400 "strip_size_kb": 0, 00:25:30.400 "state": "online", 00:25:30.400 "raid_level": "raid1", 00:25:30.400 "superblock": true, 00:25:30.400 "num_base_bdevs": 2, 00:25:30.400 
"num_base_bdevs_discovered": 2, 00:25:30.400 "num_base_bdevs_operational": 2, 00:25:30.400 "base_bdevs_list": [ 00:25:30.400 { 00:25:30.400 "name": "spare", 00:25:30.400 "uuid": "769f883e-4777-50e4-82fb-1c07291227fb", 00:25:30.400 "is_configured": true, 00:25:30.400 "data_offset": 256, 00:25:30.400 "data_size": 7936 00:25:30.400 }, 00:25:30.400 { 00:25:30.400 "name": "BaseBdev2", 00:25:30.400 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:30.400 "is_configured": true, 00:25:30.400 "data_offset": 256, 00:25:30.400 "data_size": 7936 00:25:30.400 } 00:25:30.400 ] 00:25:30.400 }' 00:25:30.400 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:30.400 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:30.659 "name": "raid_bdev1", 00:25:30.659 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:30.659 "strip_size_kb": 0, 00:25:30.659 "state": "online", 00:25:30.659 "raid_level": "raid1", 00:25:30.659 "superblock": true, 00:25:30.659 "num_base_bdevs": 2, 00:25:30.659 "num_base_bdevs_discovered": 2, 00:25:30.659 "num_base_bdevs_operational": 2, 00:25:30.659 "base_bdevs_list": [ 00:25:30.659 { 00:25:30.659 "name": "spare", 00:25:30.659 "uuid": "769f883e-4777-50e4-82fb-1c07291227fb", 00:25:30.659 "is_configured": true, 00:25:30.659 "data_offset": 256, 00:25:30.659 "data_size": 7936 00:25:30.659 }, 00:25:30.659 { 00:25:30.659 "name": "BaseBdev2", 00:25:30.659 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:30.659 "is_configured": true, 00:25:30.659 "data_offset": 256, 00:25:30.659 "data_size": 7936 00:25:30.659 } 00:25:30.659 ] 00:25:30.659 }' 00:25:30.659 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.919 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.179 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:31.179 "name": "raid_bdev1", 00:25:31.179 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:31.179 "strip_size_kb": 0, 00:25:31.179 "state": "online", 00:25:31.179 "raid_level": "raid1", 00:25:31.179 "superblock": true, 00:25:31.179 "num_base_bdevs": 2, 00:25:31.179 "num_base_bdevs_discovered": 2, 00:25:31.179 "num_base_bdevs_operational": 2, 00:25:31.179 "base_bdevs_list": [ 00:25:31.179 { 00:25:31.179 "name": "spare", 00:25:31.179 "uuid": "769f883e-4777-50e4-82fb-1c07291227fb", 00:25:31.179 "is_configured": true, 00:25:31.179 "data_offset": 256, 00:25:31.179 "data_size": 7936 00:25:31.179 }, 00:25:31.179 { 00:25:31.179 "name": "BaseBdev2", 00:25:31.179 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:31.179 "is_configured": true, 00:25:31.179 "data_offset": 256, 00:25:31.179 "data_size": 7936 00:25:31.179 } 00:25:31.179 ] 00:25:31.179 }' 00:25:31.179 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:31.179 10:20:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:31.749 10:20:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:31.749 [2024-06-10 10:20:53.479137] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:31.749 [2024-06-10 10:20:53.479157] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:31.750 [2024-06-10 10:20:53.479201] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:31.750 [2024-06-10 
10:20:53.479241] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:31.750 [2024-06-10 10:20:53.479247] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17478d0 name raid_bdev1, state offline 00:25:31.750 10:20:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.750 10:20:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:25:32.010 10:20:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:32.010 10:20:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:25:32.010 10:20:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:32.010 10:20:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:32.270 10:20:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:32.270 [2024-06-10 10:20:54.036506] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:32.270 [2024-06-10 10:20:54.036533] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:32.270 [2024-06-10 10:20:54.036545] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1747420 00:25:32.270 [2024-06-10 10:20:54.036551] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:32.270 [2024-06-10 10:20:54.037720] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:32.270 [2024-06-10 10:20:54.037740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:32.270 [2024-06-10 10:20:54.037783] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:32.270 [2024-06-10 10:20:54.037801] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:32.270 [2024-06-10 10:20:54.037871] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:32.270 spare 00:25:32.270 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:32.270 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:32.270 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:32.270 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:32.270 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:32.270 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:32.270 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:32.270 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:32.270 10:20:54 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:32.270 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:32.270 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.270 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.530 [2024-06-10 10:20:54.138155] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1748440 00:25:32.530 [2024-06-10 10:20:54.138163] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:32.530 [2024-06-10 10:20:54.138214] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x174b120 00:25:32.530 [2024-06-10 10:20:54.138282] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1748440 00:25:32.530 [2024-06-10 10:20:54.138288] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1748440 00:25:32.530 [2024-06-10 10:20:54.138332] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:32.531 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:32.531 "name": "raid_bdev1", 00:25:32.531 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:32.531 "strip_size_kb": 0, 00:25:32.531 "state": "online", 00:25:32.531 "raid_level": "raid1", 00:25:32.531 "superblock": true, 00:25:32.531 "num_base_bdevs": 2, 00:25:32.531 "num_base_bdevs_discovered": 2, 00:25:32.531 "num_base_bdevs_operational": 2, 00:25:32.531 "base_bdevs_list": [ 00:25:32.531 { 00:25:32.531 "name": "spare", 00:25:32.531 "uuid": "769f883e-4777-50e4-82fb-1c07291227fb", 00:25:32.531 "is_configured": true, 00:25:32.531 "data_offset": 256, 00:25:32.531 "data_size": 7936 00:25:32.531 }, 00:25:32.531 { 00:25:32.531 "name": "BaseBdev2", 00:25:32.531 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:32.531 "is_configured": true, 00:25:32.531 "data_offset": 256, 00:25:32.531 "data_size": 7936 00:25:32.531 } 00:25:32.531 ] 00:25:32.531 }' 00:25:32.531 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:32.531 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:33.101 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:33.101 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:33.101 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:33.101 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:33.101 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:33.101 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.101 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.360 10:20:54 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:33.360 "name": "raid_bdev1", 00:25:33.360 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:33.360 "strip_size_kb": 0, 00:25:33.360 "state": "online", 00:25:33.360 "raid_level": "raid1", 00:25:33.360 "superblock": true, 00:25:33.360 "num_base_bdevs": 2, 00:25:33.360 "num_base_bdevs_discovered": 2, 00:25:33.360 "num_base_bdevs_operational": 2, 00:25:33.360 "base_bdevs_list": [ 00:25:33.360 { 00:25:33.360 "name": "spare", 00:25:33.360 "uuid": "769f883e-4777-50e4-82fb-1c07291227fb", 00:25:33.360 "is_configured": true, 00:25:33.360 "data_offset": 256, 00:25:33.360 "data_size": 7936 00:25:33.360 }, 00:25:33.360 { 00:25:33.360 "name": "BaseBdev2", 00:25:33.360 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:33.360 "is_configured": true, 00:25:33.360 "data_offset": 256, 00:25:33.360 "data_size": 7936 00:25:33.360 } 00:25:33.360 ] 00:25:33.360 }' 00:25:33.360 10:20:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:33.360 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:33.360 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:33.360 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:33.360 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:33.361 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:33.620 [2024-06-10 10:20:55.428108] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:33.620 10:20:55 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.620 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.879 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:33.879 "name": "raid_bdev1", 00:25:33.879 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:33.879 "strip_size_kb": 0, 00:25:33.879 "state": "online", 00:25:33.879 "raid_level": "raid1", 00:25:33.879 "superblock": true, 00:25:33.879 "num_base_bdevs": 2, 00:25:33.879 "num_base_bdevs_discovered": 1, 00:25:33.879 "num_base_bdevs_operational": 1, 00:25:33.879 "base_bdevs_list": [ 00:25:33.879 { 00:25:33.879 "name": null, 00:25:33.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.879 "is_configured": false, 00:25:33.879 "data_offset": 256, 00:25:33.879 "data_size": 7936 00:25:33.879 }, 00:25:33.879 { 00:25:33.879 "name": "BaseBdev2", 00:25:33.879 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:33.879 "is_configured": true, 00:25:33.879 "data_offset": 256, 00:25:33.879 "data_size": 7936 00:25:33.879 } 00:25:33.879 ] 00:25:33.879 }' 00:25:33.879 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:33.879 10:20:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:34.450 10:20:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:34.450 [2024-06-10 10:20:56.306331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:34.450 [2024-06-10 10:20:56.306445] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:34.450 [2024-06-10 10:20:56.306453] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:34.450 [2024-06-10 10:20:56.306471] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:34.450 [2024-06-10 10:20:56.308930] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1731a60 00:25:34.450 [2024-06-10 10:20:56.310059] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:34.711 10:20:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:35.651 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:35.651 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.651 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:35.651 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:35.651 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.651 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.651 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.912 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.912 "name": "raid_bdev1", 00:25:35.912 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:35.912 "strip_size_kb": 0, 00:25:35.912 "state": "online", 00:25:35.912 "raid_level": "raid1", 00:25:35.912 "superblock": true, 00:25:35.912 "num_base_bdevs": 2, 00:25:35.912 "num_base_bdevs_discovered": 2, 00:25:35.912 "num_base_bdevs_operational": 2, 00:25:35.912 "process": { 00:25:35.912 "type": "rebuild", 00:25:35.912 "target": "spare", 00:25:35.912 "progress": { 00:25:35.912 "blocks": 2816, 00:25:35.912 "percent": 35 00:25:35.912 } 00:25:35.912 }, 00:25:35.912 "base_bdevs_list": [ 00:25:35.912 { 00:25:35.912 "name": "spare", 00:25:35.912 "uuid": "769f883e-4777-50e4-82fb-1c07291227fb", 00:25:35.912 "is_configured": true, 00:25:35.912 "data_offset": 256, 00:25:35.912 "data_size": 7936 00:25:35.912 }, 00:25:35.912 { 00:25:35.912 "name": "BaseBdev2", 00:25:35.912 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:35.912 "is_configured": true, 00:25:35.912 "data_offset": 256, 00:25:35.912 "data_size": 7936 00:25:35.912 } 00:25:35.912 ] 00:25:35.912 }' 00:25:35.912 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:35.912 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:35.912 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:35.912 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:35.912 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:36.172 [2024-06-10 10:20:57.786902] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:36.172 [2024-06-10 10:20:57.818934] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:36.172 [2024-06-10 10:20:57.818965] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:36.172 [2024-06-10 10:20:57.818974] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:36.172 [2024-06-10 10:20:57.818978] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.172 10:20:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.172 10:20:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:36.172 "name": "raid_bdev1", 00:25:36.172 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:36.172 "strip_size_kb": 0, 00:25:36.172 "state": "online", 00:25:36.172 "raid_level": "raid1", 00:25:36.172 "superblock": true, 00:25:36.172 "num_base_bdevs": 2, 00:25:36.172 "num_base_bdevs_discovered": 1, 00:25:36.172 "num_base_bdevs_operational": 1, 00:25:36.172 "base_bdevs_list": [ 00:25:36.172 { 00:25:36.172 "name": null, 00:25:36.172 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.172 "is_configured": false, 00:25:36.172 "data_offset": 256, 00:25:36.172 "data_size": 7936 00:25:36.172 }, 00:25:36.172 { 00:25:36.172 "name": "BaseBdev2", 00:25:36.172 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:36.172 "is_configured": true, 00:25:36.172 "data_offset": 256, 00:25:36.172 "data_size": 7936 00:25:36.172 } 00:25:36.172 ] 00:25:36.172 }' 00:25:36.172 10:20:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:36.172 10:20:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:36.741 10:20:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:37.001 [2024-06-10 
10:20:58.757330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:37.001 [2024-06-10 10:20:58.757365] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:37.001 [2024-06-10 10:20:58.757382] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1731da0 00:25:37.001 [2024-06-10 10:20:58.757388] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:37.001 [2024-06-10 10:20:58.757548] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:37.001 [2024-06-10 10:20:58.757558] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:37.001 [2024-06-10 10:20:58.757597] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:37.001 [2024-06-10 10:20:58.757603] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:37.001 [2024-06-10 10:20:58.757608] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:37.001 [2024-06-10 10:20:58.757620] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:37.001 [2024-06-10 10:20:58.760029] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1731a60 00:25:37.001 [2024-06-10 10:20:58.761130] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:37.001 spare 00:25:37.001 10:20:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:37.940 10:20:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:37.940 10:20:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:37.940 10:20:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:37.940 10:20:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:37.940 10:20:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:37.940 10:20:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.940 10:20:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.199 10:20:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:38.199 "name": "raid_bdev1", 00:25:38.199 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:38.199 "strip_size_kb": 0, 00:25:38.199 "state": "online", 00:25:38.199 "raid_level": "raid1", 00:25:38.199 "superblock": true, 00:25:38.199 "num_base_bdevs": 2, 00:25:38.199 "num_base_bdevs_discovered": 2, 00:25:38.199 "num_base_bdevs_operational": 2, 00:25:38.199 "process": { 00:25:38.199 "type": "rebuild", 00:25:38.199 "target": "spare", 00:25:38.199 "progress": { 00:25:38.199 "blocks": 2816, 00:25:38.199 "percent": 35 00:25:38.199 } 00:25:38.199 }, 00:25:38.199 "base_bdevs_list": [ 00:25:38.199 { 00:25:38.199 "name": "spare", 00:25:38.199 "uuid": "769f883e-4777-50e4-82fb-1c07291227fb", 00:25:38.199 "is_configured": true, 00:25:38.199 "data_offset": 256, 00:25:38.199 
"data_size": 7936 00:25:38.199 }, 00:25:38.199 { 00:25:38.199 "name": "BaseBdev2", 00:25:38.199 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:38.199 "is_configured": true, 00:25:38.199 "data_offset": 256, 00:25:38.199 "data_size": 7936 00:25:38.199 } 00:25:38.199 ] 00:25:38.199 }' 00:25:38.199 10:20:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:38.199 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:38.199 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:38.459 [2024-06-10 10:21:00.246016] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:38.459 [2024-06-10 10:21:00.270020] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:38.459 [2024-06-10 10:21:00.270055] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:38.459 [2024-06-10 10:21:00.270065] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:38.459 [2024-06-10 10:21:00.270069] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.459 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.734 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:38.734 "name": "raid_bdev1", 00:25:38.734 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:38.734 "strip_size_kb": 0, 00:25:38.734 "state": "online", 00:25:38.734 
"raid_level": "raid1", 00:25:38.734 "superblock": true, 00:25:38.734 "num_base_bdevs": 2, 00:25:38.734 "num_base_bdevs_discovered": 1, 00:25:38.734 "num_base_bdevs_operational": 1, 00:25:38.734 "base_bdevs_list": [ 00:25:38.734 { 00:25:38.734 "name": null, 00:25:38.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.734 "is_configured": false, 00:25:38.734 "data_offset": 256, 00:25:38.734 "data_size": 7936 00:25:38.734 }, 00:25:38.734 { 00:25:38.734 "name": "BaseBdev2", 00:25:38.734 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:38.734 "is_configured": true, 00:25:38.734 "data_offset": 256, 00:25:38.734 "data_size": 7936 00:25:38.734 } 00:25:38.734 ] 00:25:38.734 }' 00:25:38.734 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:38.734 10:21:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:39.351 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:39.351 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:39.351 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:39.351 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:39.351 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:39.351 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.351 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.610 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:39.610 "name": "raid_bdev1", 00:25:39.610 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:39.610 "strip_size_kb": 0, 00:25:39.610 "state": "online", 00:25:39.610 "raid_level": "raid1", 00:25:39.610 "superblock": true, 00:25:39.610 "num_base_bdevs": 2, 00:25:39.610 "num_base_bdevs_discovered": 1, 00:25:39.610 "num_base_bdevs_operational": 1, 00:25:39.610 "base_bdevs_list": [ 00:25:39.610 { 00:25:39.610 "name": null, 00:25:39.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.611 "is_configured": false, 00:25:39.611 "data_offset": 256, 00:25:39.611 "data_size": 7936 00:25:39.611 }, 00:25:39.611 { 00:25:39.611 "name": "BaseBdev2", 00:25:39.611 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:39.611 "is_configured": true, 00:25:39.611 "data_offset": 256, 00:25:39.611 "data_size": 7936 00:25:39.611 } 00:25:39.611 ] 00:25:39.611 }' 00:25:39.611 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:39.611 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:39.611 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:39.611 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:39.611 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:39.870 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:39.870 [2024-06-10 10:21:01.665658] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:39.870 [2024-06-10 10:21:01.665691] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:39.870 [2024-06-10 10:21:01.665703] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17497f0 00:25:39.870 [2024-06-10 10:21:01.665710] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:39.870 [2024-06-10 10:21:01.665852] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:39.870 [2024-06-10 10:21:01.665863] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:39.870 [2024-06-10 10:21:01.665896] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:39.870 [2024-06-10 10:21:01.665902] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:39.870 [2024-06-10 10:21:01.665908] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:39.870 BaseBdev1 00:25:39.870 10:21:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:41.253 "name": "raid_bdev1", 00:25:41.253 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:41.253 "strip_size_kb": 0, 00:25:41.253 "state": "online", 00:25:41.253 "raid_level": "raid1", 00:25:41.253 
"superblock": true, 00:25:41.253 "num_base_bdevs": 2, 00:25:41.253 "num_base_bdevs_discovered": 1, 00:25:41.253 "num_base_bdevs_operational": 1, 00:25:41.253 "base_bdevs_list": [ 00:25:41.253 { 00:25:41.253 "name": null, 00:25:41.253 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:41.253 "is_configured": false, 00:25:41.253 "data_offset": 256, 00:25:41.253 "data_size": 7936 00:25:41.253 }, 00:25:41.253 { 00:25:41.253 "name": "BaseBdev2", 00:25:41.253 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:41.253 "is_configured": true, 00:25:41.253 "data_offset": 256, 00:25:41.253 "data_size": 7936 00:25:41.253 } 00:25:41.253 ] 00:25:41.253 }' 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:41.253 10:21:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:41.822 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:41.822 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:41.822 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:41.822 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:41.822 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:41.822 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.822 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.822 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:41.822 "name": "raid_bdev1", 00:25:41.822 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:41.822 "strip_size_kb": 0, 00:25:41.822 "state": "online", 00:25:41.822 "raid_level": "raid1", 00:25:41.822 "superblock": true, 00:25:41.822 "num_base_bdevs": 2, 00:25:41.822 "num_base_bdevs_discovered": 1, 00:25:41.822 "num_base_bdevs_operational": 1, 00:25:41.822 "base_bdevs_list": [ 00:25:41.822 { 00:25:41.822 "name": null, 00:25:41.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:41.822 "is_configured": false, 00:25:41.822 "data_offset": 256, 00:25:41.823 "data_size": 7936 00:25:41.823 }, 00:25:41.823 { 00:25:41.823 "name": "BaseBdev2", 00:25:41.823 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:41.823 "is_configured": true, 00:25:41.823 "data_offset": 256, 00:25:41.823 "data_size": 7936 00:25:41.823 } 00:25:41.823 ] 00:25:41.823 }' 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@649 -- # local es=0 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:41.823 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:42.083 [2024-06-10 10:21:03.851283] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:42.083 [2024-06-10 10:21:03.851375] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:42.083 [2024-06-10 10:21:03.851383] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:42.083 request: 00:25:42.083 { 00:25:42.083 "raid_bdev": "raid_bdev1", 00:25:42.083 "base_bdev": "BaseBdev1", 00:25:42.083 "method": "bdev_raid_add_base_bdev", 00:25:42.083 "req_id": 1 00:25:42.083 } 00:25:42.083 Got JSON-RPC error response 00:25:42.083 response: 00:25:42.083 { 00:25:42.083 "code": -22, 00:25:42.083 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:42.083 } 00:25:42.083 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # es=1 00:25:42.083 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:25:42.083 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:25:42.083 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:25:42.083 10:21:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:43.022 10:21:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:43.022 10:21:04 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:43.022 10:21:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:43.022 10:21:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:43.022 10:21:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:43.022 10:21:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:43.022 10:21:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:43.022 10:21:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:43.022 10:21:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:43.022 10:21:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:43.022 10:21:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.022 10:21:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.283 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:43.283 "name": "raid_bdev1", 00:25:43.283 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:43.283 "strip_size_kb": 0, 00:25:43.283 "state": "online", 00:25:43.283 "raid_level": "raid1", 00:25:43.283 "superblock": true, 00:25:43.283 "num_base_bdevs": 2, 00:25:43.283 "num_base_bdevs_discovered": 1, 00:25:43.283 "num_base_bdevs_operational": 1, 00:25:43.283 "base_bdevs_list": [ 00:25:43.283 { 00:25:43.283 "name": null, 00:25:43.283 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.283 "is_configured": false, 00:25:43.283 "data_offset": 256, 00:25:43.283 "data_size": 7936 00:25:43.283 }, 00:25:43.283 { 00:25:43.283 "name": "BaseBdev2", 00:25:43.283 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:43.283 "is_configured": true, 00:25:43.283 "data_offset": 256, 00:25:43.283 "data_size": 7936 00:25:43.283 } 00:25:43.283 ] 00:25:43.283 }' 00:25:43.283 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:43.283 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:43.853 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:43.853 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:43.853 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:43.853 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:43.853 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:43.853 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.853 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:44.113 "name": "raid_bdev1", 00:25:44.113 "uuid": "0e47f3a6-0d7b-498d-a55b-010932855cfa", 00:25:44.113 "strip_size_kb": 0, 00:25:44.113 "state": "online", 00:25:44.113 "raid_level": "raid1", 00:25:44.113 "superblock": true, 00:25:44.113 "num_base_bdevs": 2, 00:25:44.113 "num_base_bdevs_discovered": 1, 00:25:44.113 "num_base_bdevs_operational": 1, 00:25:44.113 "base_bdevs_list": [ 00:25:44.113 { 00:25:44.113 "name": null, 00:25:44.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:44.113 "is_configured": false, 00:25:44.113 "data_offset": 256, 00:25:44.113 "data_size": 7936 00:25:44.113 }, 00:25:44.113 { 00:25:44.113 "name": "BaseBdev2", 00:25:44.113 "uuid": "ea67a3b4-6a03-54ac-95a1-1953936bfc1d", 00:25:44.113 "is_configured": true, 00:25:44.113 "data_offset": 256, 00:25:44.113 "data_size": 7936 00:25:44.113 } 00:25:44.113 ] 00:25:44.113 }' 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 1127429 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 1127429 ']' 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 1127429 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1127429 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1127429' 00:25:44.113 killing process with pid 1127429 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # kill 1127429 00:25:44.113 Received shutdown signal, test time was about 60.000000 seconds 00:25:44.113 00:25:44.113 Latency(us) 00:25:44.113 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:44.113 =================================================================================================================== 00:25:44.113 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:44.113 [2024-06-10 10:21:05.974949] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:44.113 [2024-06-10 10:21:05.975023] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:44.113 [2024-06-10 10:21:05.975053] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:44.113 [2024-06-10 10:21:05.975059] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1748440 name raid_bdev1, state offline 00:25:44.113 10:21:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@973 -- # wait 1127429 00:25:44.373 [2024-06-10 10:21:05.990311] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:44.373 10:21:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:25:44.373 00:25:44.373 real 0m25.222s 00:25:44.373 user 0m40.005s 00:25:44.373 sys 0m2.657s 00:25:44.373 10:21:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:44.373 10:21:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:44.373 ************************************ 00:25:44.373 END TEST raid_rebuild_test_sb_md_interleaved 00:25:44.373 ************************************ 00:25:44.373 10:21:06 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:25:44.373 10:21:06 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:25:44.373 10:21:06 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 1127429 ']' 00:25:44.373 10:21:06 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 1127429 00:25:44.373 10:21:06 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:25:44.373 00:25:44.373 real 15m27.627s 00:25:44.373 user 26m29.213s 00:25:44.373 sys 2m15.647s 00:25:44.373 10:21:06 bdev_raid -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:44.373 10:21:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:44.373 ************************************ 00:25:44.373 END TEST bdev_raid 00:25:44.373 ************************************ 00:25:44.373 10:21:06 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:25:44.373 10:21:06 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:25:44.373 10:21:06 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:44.373 10:21:06 -- common/autotest_common.sh@10 -- # set +x 00:25:44.634 ************************************ 00:25:44.634 START TEST bdevperf_config 00:25:44.634 ************************************ 00:25:44.634 10:21:06 bdevperf_config -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:25:44.634 * Looking for test storage... 
00:25:44.634 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:44.634 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:44.634 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:44.634 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:25:44.634 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:44.634 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:44.634 10:21:06 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:47.176 10:21:08 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-06-10 10:21:06.460214] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:25:47.176 [2024-06-10 10:21:06.460276] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1132276 ] 00:25:47.176 Using job config with 4 jobs 00:25:47.176 [2024-06-10 10:21:06.571775] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:47.176 [2024-06-10 10:21:06.663907] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:47.176 cpumask for '\''job0'\'' is too big 00:25:47.176 cpumask for '\''job1'\'' is too big 00:25:47.176 cpumask for '\''job2'\'' is too big 00:25:47.176 cpumask for '\''job3'\'' is too big 00:25:47.176 Running I/O for 2 seconds... 00:25:47.176 00:25:47.176 Latency(us) 00:25:47.176 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:47.176 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.176 Malloc0 : 2.01 28853.28 28.18 0.00 0.00 8863.65 1581.69 13712.15 00:25:47.176 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.176 Malloc0 : 2.02 28830.83 28.16 0.00 0.00 8853.08 1569.08 12098.95 00:25:47.176 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.176 Malloc0 : 2.02 28808.61 28.13 0.00 0.00 8843.42 1562.78 10536.17 00:25:47.176 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.176 Malloc0 : 2.02 28880.90 28.20 0.00 0.00 8804.24 775.09 9124.63 00:25:47.176 =================================================================================================================== 00:25:47.176 Total : 115373.61 112.67 0.00 0.00 8841.06 775.09 13712.15' 00:25:47.176 10:21:08 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-06-10 10:21:06.460214] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:25:47.176 [2024-06-10 10:21:06.460276] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1132276 ] 00:25:47.176 Using job config with 4 jobs 00:25:47.176 [2024-06-10 10:21:06.571775] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:47.177 [2024-06-10 10:21:06.663907] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:47.177 cpumask for '\''job0'\'' is too big 00:25:47.177 cpumask for '\''job1'\'' is too big 00:25:47.177 cpumask for '\''job2'\'' is too big 00:25:47.177 cpumask for '\''job3'\'' is too big 00:25:47.177 Running I/O for 2 seconds... 00:25:47.177 00:25:47.177 Latency(us) 00:25:47.177 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:47.177 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.177 Malloc0 : 2.01 28853.28 28.18 0.00 0.00 8863.65 1581.69 13712.15 00:25:47.177 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.177 Malloc0 : 2.02 28830.83 28.16 0.00 0.00 8853.08 1569.08 12098.95 00:25:47.177 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.177 Malloc0 : 2.02 28808.61 28.13 0.00 0.00 8843.42 1562.78 10536.17 00:25:47.177 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.177 Malloc0 : 2.02 28880.90 28.20 0.00 0.00 8804.24 775.09 9124.63 00:25:47.177 =================================================================================================================== 00:25:47.177 Total : 115373.61 112.67 0.00 0.00 8841.06 775.09 13712.15' 00:25:47.177 10:21:08 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 10:21:06.460214] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:25:47.177 [2024-06-10 10:21:06.460276] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1132276 ] 00:25:47.177 Using job config with 4 jobs 00:25:47.177 [2024-06-10 10:21:06.571775] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:47.177 [2024-06-10 10:21:06.663907] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:47.177 cpumask for '\''job0'\'' is too big 00:25:47.177 cpumask for '\''job1'\'' is too big 00:25:47.177 cpumask for '\''job2'\'' is too big 00:25:47.177 cpumask for '\''job3'\'' is too big 00:25:47.177 Running I/O for 2 seconds... 
00:25:47.177 00:25:47.177 Latency(us) 00:25:47.177 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:47.177 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.177 Malloc0 : 2.01 28853.28 28.18 0.00 0.00 8863.65 1581.69 13712.15 00:25:47.177 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.177 Malloc0 : 2.02 28830.83 28.16 0.00 0.00 8853.08 1569.08 12098.95 00:25:47.177 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.177 Malloc0 : 2.02 28808.61 28.13 0.00 0.00 8843.42 1562.78 10536.17 00:25:47.177 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:47.177 Malloc0 : 2.02 28880.90 28.20 0.00 0.00 8804.24 775.09 9124.63 00:25:47.177 =================================================================================================================== 00:25:47.177 Total : 115373.61 112.67 0.00 0.00 8841.06 775.09 13712.15' 00:25:47.177 10:21:08 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:25:47.177 10:21:08 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:25:47.177 10:21:08 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:25:47.177 10:21:08 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:47.177 [2024-06-10 10:21:08.994633] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:25:47.177 [2024-06-10 10:21:08.994685] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1132740 ] 00:25:47.436 [2024-06-10 10:21:09.105467] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:47.436 [2024-06-10 10:21:09.196416] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:47.436 cpumask for 'job0' is too big 00:25:47.436 cpumask for 'job1' is too big 00:25:47.436 cpumask for 'job2' is too big 00:25:47.436 cpumask for 'job3' is too big 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:25:49.976 Running I/O for 2 seconds... 
00:25:49.976 00:25:49.976 Latency(us) 00:25:49.976 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:49.976 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:49.976 Malloc0 : 2.01 28617.35 27.95 0.00 0.00 8941.13 1613.19 13812.97 00:25:49.976 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:49.976 Malloc0 : 2.01 28595.12 27.92 0.00 0.00 8930.84 1594.29 12149.37 00:25:49.976 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:49.976 Malloc0 : 2.02 28636.01 27.96 0.00 0.00 8901.38 1575.38 10637.00 00:25:49.976 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:49.976 Malloc0 : 2.02 28613.98 27.94 0.00 0.00 8891.64 1569.08 9376.69 00:25:49.976 =================================================================================================================== 00:25:49.976 Total : 114462.45 111.78 0.00 0.00 8916.20 1569.08 13812.97' 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:49.976 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:49.976 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:49.976 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:49.976 10:21:11 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:52.519 10:21:13 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-06-10 10:21:11.547854] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:25:52.519 [2024-06-10 10:21:11.547907] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1133421 ] 00:25:52.519 Using job config with 3 jobs 00:25:52.519 [2024-06-10 10:21:11.652500] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:52.519 [2024-06-10 10:21:11.732285] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:52.519 cpumask for '\''job0'\'' is too big 00:25:52.519 cpumask for '\''job1'\'' is too big 00:25:52.519 cpumask for '\''job2'\'' is too big 00:25:52.519 Running I/O for 2 seconds... 00:25:52.519 00:25:52.519 Latency(us) 00:25:52.519 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:52.519 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:52.519 Malloc0 : 2.01 38674.08 37.77 0.00 0.00 6623.37 1543.88 9729.58 00:25:52.519 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:52.519 Malloc0 : 2.01 38644.02 37.74 0.00 0.00 6615.96 1537.58 8166.79 00:25:52.519 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:52.519 Malloc0 : 2.02 38614.09 37.71 0.00 0.00 6608.93 1506.07 6856.07 00:25:52.519 =================================================================================================================== 00:25:52.519 Total : 115932.19 113.22 0.00 0.00 6616.09 1506.07 9729.58' 00:25:52.519 10:21:13 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-06-10 10:21:11.547854] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:25:52.519 [2024-06-10 10:21:11.547907] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1133421 ] 00:25:52.519 Using job config with 3 jobs 00:25:52.519 [2024-06-10 10:21:11.652500] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:52.519 [2024-06-10 10:21:11.732285] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:52.519 cpumask for '\''job0'\'' is too big 00:25:52.519 cpumask for '\''job1'\'' is too big 00:25:52.519 cpumask for '\''job2'\'' is too big 00:25:52.519 Running I/O for 2 seconds... 
00:25:52.519 00:25:52.519 Latency(us) 00:25:52.519 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:52.519 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:52.519 Malloc0 : 2.01 38674.08 37.77 0.00 0.00 6623.37 1543.88 9729.58 00:25:52.519 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:52.519 Malloc0 : 2.01 38644.02 37.74 0.00 0.00 6615.96 1537.58 8166.79 00:25:52.519 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:52.519 Malloc0 : 2.02 38614.09 37.71 0.00 0.00 6608.93 1506.07 6856.07 00:25:52.519 =================================================================================================================== 00:25:52.519 Total : 115932.19 113.22 0.00 0.00 6616.09 1506.07 9729.58' 00:25:52.519 10:21:13 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 10:21:11.547854] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:25:52.519 [2024-06-10 10:21:11.547907] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1133421 ] 00:25:52.519 Using job config with 3 jobs 00:25:52.519 [2024-06-10 10:21:11.652500] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:52.519 [2024-06-10 10:21:11.732285] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:52.519 cpumask for '\''job0'\'' is too big 00:25:52.519 cpumask for '\''job1'\'' is too big 00:25:52.519 cpumask for '\''job2'\'' is too big 00:25:52.519 Running I/O for 2 seconds... 00:25:52.519 00:25:52.519 Latency(us) 00:25:52.519 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:52.519 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:52.519 Malloc0 : 2.01 38674.08 37.77 0.00 0.00 6623.37 1543.88 9729.58 00:25:52.519 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:52.519 Malloc0 : 2.01 38644.02 37.74 0.00 0.00 6615.96 1537.58 8166.79 00:25:52.519 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:52.519 Malloc0 : 2.02 38614.09 37.71 0.00 0.00 6608.93 1506.07 6856.07 00:25:52.519 =================================================================================================================== 00:25:52.519 Total : 115932.19 113.22 0.00 0.00 6616.09 1506.07 9729.58' 00:25:52.519 10:21:13 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:25:52.519 10:21:13 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:25:52.519 10:21:13 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:25:52.519 10:21:13 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:25:52.519 10:21:13 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:52.519 10:21:14 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:25:52.519 10:21:14 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:25:52.519 10:21:14 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:25:52.520 10:21:14 
bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:52.520 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:52.520 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:52.520 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:52.520 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:52.520 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:52.520 10:21:14 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:55.064 10:21:16 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-06-10 10:21:14.088751] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:25:55.064 [2024-06-10 10:21:14.088809] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1133980 ] 00:25:55.064 Using job config with 4 jobs 00:25:55.064 [2024-06-10 10:21:14.199309] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:55.064 [2024-06-10 10:21:14.289509] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:55.064 cpumask for '\''job0'\'' is too big 00:25:55.064 cpumask for '\''job1'\'' is too big 00:25:55.064 cpumask for '\''job2'\'' is too big 00:25:55.064 cpumask for '\''job3'\'' is too big 00:25:55.064 Running I/O for 2 seconds... 00:25:55.064 00:25:55.064 Latency(us) 00:25:55.064 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:55.064 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.064 Malloc0 : 2.03 14344.85 14.01 0.00 0.00 17852.20 3276.80 27827.59 00:25:55.064 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.064 Malloc1 : 2.04 14333.52 14.00 0.00 0.00 17851.48 3906.95 27827.59 00:25:55.064 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.064 Malloc0 : 2.04 14322.49 13.99 0.00 0.00 17810.97 3213.78 24399.56 00:25:55.064 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.064 Malloc1 : 2.04 14311.27 13.98 0.00 0.00 17810.27 3831.34 24399.56 00:25:55.065 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc0 : 2.04 14300.30 13.97 0.00 0.00 17771.64 3251.59 21173.17 00:25:55.065 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc1 : 2.04 14289.17 13.95 0.00 0.00 17771.52 3856.54 21173.17 00:25:55.065 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc0 : 2.04 14278.18 13.94 0.00 0.00 17731.55 3201.18 18148.43 00:25:55.065 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc1 : 2.05 14267.07 13.93 0.00 0.00 17731.54 3856.54 18148.43 00:25:55.065 =================================================================================================================== 00:25:55.065 Total : 114446.85 111.76 0.00 0.00 17791.39 3201.18 27827.59' 00:25:55.065 10:21:16 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-06-10 10:21:14.088751] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:25:55.065 [2024-06-10 10:21:14.088809] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1133980 ] 00:25:55.065 Using job config with 4 jobs 00:25:55.065 [2024-06-10 10:21:14.199309] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:55.065 [2024-06-10 10:21:14.289509] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:55.065 cpumask for '\''job0'\'' is too big 00:25:55.065 cpumask for '\''job1'\'' is too big 00:25:55.065 cpumask for '\''job2'\'' is too big 00:25:55.065 cpumask for '\''job3'\'' is too big 00:25:55.065 Running I/O for 2 seconds... 
00:25:55.065 00:25:55.065 Latency(us) 00:25:55.065 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:55.065 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc0 : 2.03 14344.85 14.01 0.00 0.00 17852.20 3276.80 27827.59 00:25:55.065 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc1 : 2.04 14333.52 14.00 0.00 0.00 17851.48 3906.95 27827.59 00:25:55.065 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc0 : 2.04 14322.49 13.99 0.00 0.00 17810.97 3213.78 24399.56 00:25:55.065 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc1 : 2.04 14311.27 13.98 0.00 0.00 17810.27 3831.34 24399.56 00:25:55.065 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc0 : 2.04 14300.30 13.97 0.00 0.00 17771.64 3251.59 21173.17 00:25:55.065 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc1 : 2.04 14289.17 13.95 0.00 0.00 17771.52 3856.54 21173.17 00:25:55.065 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc0 : 2.04 14278.18 13.94 0.00 0.00 17731.55 3201.18 18148.43 00:25:55.065 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc1 : 2.05 14267.07 13.93 0.00 0.00 17731.54 3856.54 18148.43 00:25:55.065 =================================================================================================================== 00:25:55.065 Total : 114446.85 111.76 0.00 0.00 17791.39 3201.18 27827.59' 00:25:55.065 10:21:16 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 10:21:14.088751] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:25:55.065 [2024-06-10 10:21:14.088809] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1133980 ] 00:25:55.065 Using job config with 4 jobs 00:25:55.065 [2024-06-10 10:21:14.199309] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:55.065 [2024-06-10 10:21:14.289509] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:55.065 cpumask for '\''job0'\'' is too big 00:25:55.065 cpumask for '\''job1'\'' is too big 00:25:55.065 cpumask for '\''job2'\'' is too big 00:25:55.065 cpumask for '\''job3'\'' is too big 00:25:55.065 Running I/O for 2 seconds... 
00:25:55.065 00:25:55.065 Latency(us) 00:25:55.065 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:55.065 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc0 : 2.03 14344.85 14.01 0.00 0.00 17852.20 3276.80 27827.59 00:25:55.065 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc1 : 2.04 14333.52 14.00 0.00 0.00 17851.48 3906.95 27827.59 00:25:55.065 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc0 : 2.04 14322.49 13.99 0.00 0.00 17810.97 3213.78 24399.56 00:25:55.065 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc1 : 2.04 14311.27 13.98 0.00 0.00 17810.27 3831.34 24399.56 00:25:55.065 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc0 : 2.04 14300.30 13.97 0.00 0.00 17771.64 3251.59 21173.17 00:25:55.065 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc1 : 2.04 14289.17 13.95 0.00 0.00 17771.52 3856.54 21173.17 00:25:55.065 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc0 : 2.04 14278.18 13.94 0.00 0.00 17731.55 3201.18 18148.43 00:25:55.065 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:55.065 Malloc1 : 2.05 14267.07 13.93 0.00 0.00 17731.54 3856.54 18148.43 00:25:55.065 =================================================================================================================== 00:25:55.065 Total : 114446.85 111.76 0.00 0.00 17791.39 3201.18 27827.59' 00:25:55.065 10:21:16 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:25:55.065 10:21:16 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:25:55.065 10:21:16 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:25:55.065 10:21:16 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:25:55.065 10:21:16 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:55.065 10:21:16 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:25:55.065 00:25:55.065 real 0m10.341s 00:25:55.065 user 0m9.321s 00:25:55.065 sys 0m0.850s 00:25:55.065 10:21:16 bdevperf_config -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:55.065 10:21:16 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:25:55.065 ************************************ 00:25:55.065 END TEST bdevperf_config 00:25:55.065 ************************************ 00:25:55.065 10:21:16 -- spdk/autotest.sh@192 -- # uname -s 00:25:55.065 10:21:16 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:25:55.065 10:21:16 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:25:55.065 10:21:16 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:25:55.065 10:21:16 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:55.065 10:21:16 -- common/autotest_common.sh@10 -- # set +x 00:25:55.065 ************************************ 00:25:55.065 START TEST reactor_set_interrupt 00:25:55.065 ************************************ 00:25:55.065 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:25:55.065 * Looking for test storage... 00:25:55.065 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:55.065 10:21:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:25:55.065 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:25:55.065 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:55.065 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:55.065 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:25:55.065 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:55.065 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:25:55.065 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:25:55.065 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:25:55.065 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:25:55.065 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:25:55.065 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:25:55.066 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:25:55.066 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:25:55.066 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:25:55.066 10:21:16 reactor_set_interrupt -- 
common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@48 -- # 
CONFIG_RDMA=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:25:55.066 10:21:16 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:25:55.066 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:25:55.066 10:21:16 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:25:55.066 #define SPDK_CONFIG_H 00:25:55.066 #define SPDK_CONFIG_APPS 1 00:25:55.066 #define SPDK_CONFIG_ARCH native 00:25:55.066 #undef SPDK_CONFIG_ASAN 00:25:55.066 #undef SPDK_CONFIG_AVAHI 00:25:55.066 #undef SPDK_CONFIG_CET 00:25:55.066 #define SPDK_CONFIG_COVERAGE 1 00:25:55.066 #define SPDK_CONFIG_CROSS_PREFIX 00:25:55.066 #define SPDK_CONFIG_CRYPTO 1 00:25:55.066 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:25:55.066 #undef SPDK_CONFIG_CUSTOMOCF 00:25:55.066 #undef SPDK_CONFIG_DAOS 00:25:55.066 #define SPDK_CONFIG_DAOS_DIR 00:25:55.066 #define SPDK_CONFIG_DEBUG 1 00:25:55.066 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:25:55.066 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:25:55.066 #define SPDK_CONFIG_DPDK_INC_DIR 00:25:55.066 #define SPDK_CONFIG_DPDK_LIB_DIR 00:25:55.066 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:25:55.066 #undef SPDK_CONFIG_DPDK_UADK 00:25:55.066 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:25:55.066 #define SPDK_CONFIG_EXAMPLES 1 00:25:55.067 #undef SPDK_CONFIG_FC 00:25:55.067 #define SPDK_CONFIG_FC_PATH 00:25:55.067 #define SPDK_CONFIG_FIO_PLUGIN 1 00:25:55.067 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:25:55.067 #undef SPDK_CONFIG_FUSE 00:25:55.067 #undef SPDK_CONFIG_FUZZER 00:25:55.067 #define SPDK_CONFIG_FUZZER_LIB 00:25:55.067 #undef SPDK_CONFIG_GOLANG 00:25:55.067 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:25:55.067 #define SPDK_CONFIG_HAVE_EVP_MAC 1 
00:25:55.067 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:25:55.067 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:25:55.067 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:25:55.067 #undef SPDK_CONFIG_HAVE_LIBBSD 00:25:55.067 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:25:55.067 #define SPDK_CONFIG_IDXD 1 00:25:55.067 #define SPDK_CONFIG_IDXD_KERNEL 1 00:25:55.067 #define SPDK_CONFIG_IPSEC_MB 1 00:25:55.067 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:25:55.067 #define SPDK_CONFIG_ISAL 1 00:25:55.067 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:25:55.067 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:25:55.067 #define SPDK_CONFIG_LIBDIR 00:25:55.067 #undef SPDK_CONFIG_LTO 00:25:55.067 #define SPDK_CONFIG_MAX_LCORES 00:25:55.067 #define SPDK_CONFIG_NVME_CUSE 1 00:25:55.067 #undef SPDK_CONFIG_OCF 00:25:55.067 #define SPDK_CONFIG_OCF_PATH 00:25:55.067 #define SPDK_CONFIG_OPENSSL_PATH 00:25:55.067 #undef SPDK_CONFIG_PGO_CAPTURE 00:25:55.067 #define SPDK_CONFIG_PGO_DIR 00:25:55.067 #undef SPDK_CONFIG_PGO_USE 00:25:55.067 #define SPDK_CONFIG_PREFIX /usr/local 00:25:55.067 #undef SPDK_CONFIG_RAID5F 00:25:55.067 #undef SPDK_CONFIG_RBD 00:25:55.067 #define SPDK_CONFIG_RDMA 1 00:25:55.067 #define SPDK_CONFIG_RDMA_PROV verbs 00:25:55.067 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:25:55.067 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:25:55.067 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:25:55.067 #define SPDK_CONFIG_SHARED 1 00:25:55.067 #undef SPDK_CONFIG_SMA 00:25:55.067 #define SPDK_CONFIG_TESTS 1 00:25:55.067 #undef SPDK_CONFIG_TSAN 00:25:55.067 #define SPDK_CONFIG_UBLK 1 00:25:55.067 #define SPDK_CONFIG_UBSAN 1 00:25:55.067 #undef SPDK_CONFIG_UNIT_TESTS 00:25:55.067 #undef SPDK_CONFIG_URING 00:25:55.067 #define SPDK_CONFIG_URING_PATH 00:25:55.067 #undef SPDK_CONFIG_URING_ZNS 00:25:55.067 #undef SPDK_CONFIG_USDT 00:25:55.067 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:25:55.067 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:25:55.067 #undef SPDK_CONFIG_VFIO_USER 00:25:55.067 #define SPDK_CONFIG_VFIO_USER_DIR 00:25:55.067 #define SPDK_CONFIG_VHOST 1 00:25:55.067 #define SPDK_CONFIG_VIRTIO 1 00:25:55.067 #undef SPDK_CONFIG_VTUNE 00:25:55.067 #define SPDK_CONFIG_VTUNE_DIR 00:25:55.067 #define SPDK_CONFIG_WERROR 1 00:25:55.067 #define SPDK_CONFIG_WPDK_DIR 00:25:55.067 #undef SPDK_CONFIG_XNVME 00:25:55.067 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:25:55.067 10:21:16 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:25:55.067 10:21:16 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:25:55.067 10:21:16 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:25:55.067 10:21:16 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:25:55.067 10:21:16 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:55.067 10:21:16 reactor_set_interrupt -- paths/export.sh@3 
-- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:55.067 10:21:16 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:55.067 10:21:16 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:25:55.067 10:21:16 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:25:55.067 10:21:16 reactor_set_interrupt -- 
pm/common@76 -- # SUDO[1]='sudo -E' 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:25:55.067 10:21:16 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:25:55.067 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export 
SPDK_TEST_NVME_CLI 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:25:55.068 10:21:16 reactor_set_interrupt -- 
common/autotest_common.sh@124 -- # : 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- 
common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export 
PCI_BLOCK_SYNC_ON_RESET=yes 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:25:55.068 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 
00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j128 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 1134599 ]] 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 1134599 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@1679 -- # set_test_storage 2147483648 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:25:55.069 10:21:16 reactor_set_interrupt 
-- common/autotest_common.sh@331 -- # local mount target_dir 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.gIXGXf 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.gIXGXf/tests/interrupt /tmp/spdk.gIXGXf 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:25:55.069 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=957218816 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4327211008 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=118525132800 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=129376284672 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=10851151872 00:25:55.330 10:21:16 reactor_set_interrupt -- 
common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=64683429888 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688140288 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=25865334784 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=25875259392 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9924608 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=efivarfs 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=efivarfs 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=339968 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=507904 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=163840 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=64687251456 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688144384 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=892928 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=12937621504 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=12937625600 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:25:55.330 * Looking for test storage... 
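The set_test_storage trace above reads df -T into a set of associative arrays (mounts, fss, sizes, avails, uses) and then walks the candidate directories until it finds one whose backing mount offers the requested 2 GiB. A minimal bash sketch of that selection loop follows; it reuses the array and variable names visible in the trace, but the byte-unit df invocation and the single-candidate list are simplifications, and the real autotest_common.sh additionally handles tmpfs/ramfs mounts and the mktemp fallback directory.

    # Simplified sketch of the storage-selection loop shown in the trace.
    requested_size=2147483648                      # 2 GiB, as requested above
    declare -A mounts fss sizes avails uses

    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$size
        avails["$mount"]=$avail
        uses["$mount"]=$use
    done < <(df -T --block-size=1 | grep -v Filesystem)   # bytes assumed here

    for target_dir in "$PWD"; do                   # real list: testdir + fallbacks
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        target_space=${avails[$mount]}
        if (( target_space >= requested_size )); then
            export SPDK_TEST_STORAGE=$target_dir
            printf '* Found test storage at %s\n' "$target_dir"
            break
        fi
    done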
00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=118525132800 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:25:55.330 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=13065744384 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:55.331 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@1681 -- # set -o errtrace 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # shopt -s extdebug 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@1685 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # true 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@1688 -- # xtrace_fd 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1134646 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1134646 /var/tmp/spdk.sock 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@830 -- # '[' -z 1134646 ']' 00:25:55.331 10:21:16 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:55.331 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
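start_intr_tgt, whose trace begins here, launches the interrupt_tgt example on a three-core mask and then blocks in waitforlisten until the RPC socket answers. A rough sketch of that startup sequence is below; the interrupt_tgt command line and the 100-retry limit come straight from the log, while the rpc_get_methods probe and the 0.5 s sleep are assumptions about how the real waitforlisten polls.

    rpc_py=./scripts/rpc.py                        # log uses the absolute workspace path
    rpc_addr=/var/tmp/spdk.sock
    cpu_mask=0x07

    ./build/examples/interrupt_tgt -m "$cpu_mask" -r "$rpc_addr" -E -g &
    intr_tgt_pid=$!
    trap 'kill "$intr_tgt_pid"; exit 1' SIGINT SIGTERM EXIT

    echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
    for ((i = 0; i < 100; i++)); do                # max_retries=100, as in the log
        if "$rpc_py" -s "$rpc_addr" rpc_get_methods &> /dev/null; then
            break                                  # target is up and serving RPCs
        fi
        sleep 0.5                                  # assumed polling interval
    done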
00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:55.331 10:21:16 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:25:55.331 [2024-06-10 10:21:16.999486] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:25:55.331 [2024-06-10 10:21:16.999545] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1134646 ] 00:25:55.331 [2024-06-10 10:21:17.091476] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:55.331 [2024-06-10 10:21:17.160420] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:25:55.331 [2024-06-10 10:21:17.160546] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:25:55.331 [2024-06-10 10:21:17.160549] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:55.591 [2024-06-10 10:21:17.210311] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:25:56.162 10:21:17 reactor_set_interrupt -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:56.162 10:21:17 reactor_set_interrupt -- common/autotest_common.sh@863 -- # return 0 00:25:56.162 10:21:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:25:56.162 10:21:17 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:56.422 Malloc0 00:25:56.422 Malloc1 00:25:56.422 Malloc2 00:25:56.422 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:25:56.422 10:21:18 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:25:56.422 10:21:18 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:25:56.422 10:21:18 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:25:56.422 5000+0 records in 00:25:56.422 5000+0 records out 00:25:56.422 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0174899 s, 585 MB/s 00:25:56.422 10:21:18 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:25:56.682 AIO0 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1134646 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1134646 without_thd 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1134646 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@58 -- # 
reactor_cpumask=1 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:25:56.682 10:21:18 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:25:56.943 spdk_thread ids are 1 on reactor0. 
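The thd0_ids/thd2_ids lookups above resolve which spdk_thread ids live on a given reactor: thread_get_stats is queried over the RPC socket and filtered with jq on the thread's cpumask. The commands below mirror the trace for reactor 0 (mask 0x1); the arithmetic expansion used to strip the 0x prefix is one plausible way to obtain the bare "1" the trace shows.

    reactor_cpumask=0x1
    reactor_cpumask=$((reactor_cpumask))           # 0x1 -> 1, as shown in the trace
    jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'

    "$rpc_py" thread_get_stats \
        | jq --arg reactor_cpumask "$reactor_cpumask" "$jq_str"
    # prints 1 (the app_thread) in this run; the same query with 0x4 for
    # reactor 2 returns nothing, since no spdk_thread is scheduled there yet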
00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1134646 0 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1134646 0 idle 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1134646 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1134646 -w 256 00:25:56.943 10:21:18 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1134646 root 20 0 128.2g 34816 23552 S 0.0 0.0 0:00.30 reactor_0' 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1134646 root 20 0 128.2g 34816 23552 S 0.0 0.0 0:00.30 reactor_0 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1134646 1 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1134646 1 idle 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1134646 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1134646 -w 256 00:25:57.204 10:21:18 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 
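reactor_is_busy_or_idle, which produces the top_reactor/cpu_rate lines above, samples one batch iteration of top with per-thread output, picks the reactor_N row, and reads the %CPU column: at most 30% counts as idle, at least 70% counts as busy. A condensed sketch follows; the real helper retries the top sample up to ten times, which is omitted here.

    reactor_is_busy_or_idle() {
        local pid=$1 idx=$2 state=$3               # e.g. 1134646 0 idle
        local top_reactor cpu_rate
        top_reactor=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_$idx")
        cpu_rate=$(echo "$top_reactor" | sed -e 's/^\s*//g' | awk '{print $9}')
        cpu_rate=${cpu_rate%%.*}                   # 93.8 -> 93, one way to match the trace
        if [[ $state == busy ]]; then
            [[ $cpu_rate -lt 70 ]] && return 1     # busy reactors must burn >= 70 %CPU
        else
            [[ $cpu_rate -gt 30 ]] && return 1     # idle reactors must stay <= 30 %CPU
        fi
        return 0
    }

    reactor_is_busy_or_idle 1134646 0 idle         # pid and state from this run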
00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1134649 root 20 0 128.2g 34816 23552 S 0.0 0.0 0:00.00 reactor_1' 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1134649 root 20 0 128.2g 34816 23552 S 0.0 0.0 0:00.00 reactor_1 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1134646 2 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1134646 2 idle 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1134646 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1134646 -w 256 00:25:57.204 10:21:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:25:57.464 10:21:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1134650 root 20 0 128.2g 34816 23552 S 0.0 0.0 0:00.00 reactor_2' 00:25:57.464 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1134650 root 20 0 128.2g 34816 23552 S 0.0 0.0 0:00.00 reactor_2 00:25:57.464 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:57.464 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:57.464 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:25:57.464 10:21:19 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:25:57.464 10:21:19 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:25:57.465 10:21:19 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:25:57.465 10:21:19 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:25:57.465 10:21:19 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:57.465 10:21:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:25:57.465 10:21:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 
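This is the without_thd branch: the loop over thd0_ids that starts here first moves the app_thread off reactor 0, and only then is interrupt mode disabled on reactors 0 and 2 through the interrupt_plugin RPC (enabled by the extra PYTHONPATH entry for examples/interrupt_tgt set earlier in the trace). A compact restatement of that sequence, reusing the helper sketched above and the pid/rpc_py values from this run:

    "$rpc_py" thread_set_cpumask -i 1 -m 0x2       # park app_thread (id 1) on reactor 1

    for i in 0 2; do
        "$rpc_py" --plugin interrupt_plugin reactor_set_interrupt_mode "$i" -d
        reactor_is_busy_or_idle "$intr_tgt_pid" "$i" busy || exit 1   # poll mode => busy
    done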
00:25:57.465 10:21:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:25:57.725 [2024-06-10 10:21:19.409335] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:25:57.725 10:21:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:25:57.725 [2024-06-10 10:21:19.589028] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:25:57.985 [2024-06-10 10:21:19.591378] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:25:57.985 [2024-06-10 10:21:19.768896] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:25:57.985 [2024-06-10 10:21:19.769192] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1134646 0 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1134646 0 busy 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1134646 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1134646 -w 256 00:25:57.985 10:21:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1134646 root 20 0 128.2g 34816 23552 R 93.8 0.0 0:00.67 reactor_0' 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1134646 root 20 0 128.2g 34816 23552 R 93.8 0.0 0:00.67 reactor_0 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.8 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]] 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:58.244 10:21:19 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1134646 2 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1134646 2 busy 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1134646 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:58.244 10:21:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1134646 -w 256 00:25:58.245 10:21:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1134650 root 20 0 128.2g 34816 23552 R 99.9 0.0 0:00.36 reactor_2' 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1134650 root 20 0 128.2g 34816 23552 R 99.9 0.0 0:00.36 reactor_2 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:25:58.504 [2024-06-10 10:21:20.328901] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:25:58.504 [2024-06-10 10:21:20.329003] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1134646 2 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1134646 2 idle 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1134646 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:58.504 10:21:20 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:58.505 10:21:20 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:58.505 10:21:20 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:58.505 10:21:20 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:58.505 10:21:20 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:58.505 10:21:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1134646 -w 256 00:25:58.505 10:21:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:25:58.765 10:21:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1134650 root 20 0 128.2g 34816 23552 S 0.0 0.0 0:00.55 reactor_2' 00:25:58.765 10:21:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:58.765 10:21:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1134650 root 20 0 128.2g 34816 23552 S 0.0 0.0 0:00.55 reactor_2 00:25:58.765 10:21:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:58.765 10:21:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:25:58.765 10:21:20 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:25:58.765 10:21:20 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:25:58.765 10:21:20 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:25:58.765 10:21:20 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:25:58.765 10:21:20 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:58.765 10:21:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:25:59.025 [2024-06-10 10:21:20.692897] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:25:59.025 [2024-06-10 10:21:20.693195] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:25:59.025 10:21:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:25:59.025 10:21:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:25:59.025 10:21:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:25:59.025 [2024-06-10 10:21:20.885291] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1134646 0 00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1134646 0 idle 00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1134646 00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1134646 -w 256 00:25:59.285 10:21:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1134646 root 20 0 128.2g 34816 23552 S 6.7 0.0 0:01.41 reactor_0' 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1134646 root 20 0 128.2g 34816 23552 S 6.7 0.0 0:01.41 reactor_0 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:25:59.285 10:21:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1134646 00:25:59.285 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@949 -- # '[' -z 1134646 ']' 00:25:59.285 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@953 -- # kill -0 1134646 00:25:59.285 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@954 -- # uname 00:25:59.285 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:59.285 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1134646 00:25:59.285 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:59.286 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:59.286 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1134646' 00:25:59.286 killing process with pid 1134646 00:25:59.286 10:21:21 reactor_set_interrupt -- 
common/autotest_common.sh@968 -- # kill 1134646 00:25:59.286 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@973 -- # wait 1134646 00:25:59.545 10:21:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:25:59.545 10:21:21 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:25:59.545 10:21:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:25:59.545 10:21:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:59.545 10:21:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:25:59.545 10:21:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1135339 00:25:59.545 10:21:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:59.546 10:21:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:25:59.546 10:21:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1135339 /var/tmp/spdk.sock 00:25:59.546 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@830 -- # '[' -z 1135339 ']' 00:25:59.546 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:59.546 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:59.546 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:59.546 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:59.546 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:59.546 10:21:21 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:25:59.546 [2024-06-10 10:21:21.321622] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:25:59.546 [2024-06-10 10:21:21.321673] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1135339 ] 00:25:59.546 [2024-06-10 10:21:21.409392] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:59.805 [2024-06-10 10:21:21.474970] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:25:59.805 [2024-06-10 10:21:21.475098] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:25:59.805 [2024-06-10 10:21:21.475101] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:59.805 [2024-06-10 10:21:21.524708] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
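With the first target killed and the aiofile removed, the second phase (reactor_set_mode_with_threads) starts a fresh interrupt_tgt and, as the Malloc0/Malloc1/Malloc2 and dd lines below repeat, provisions the same bdevs as before. A sketch of that setup_bdev_mem/setup_bdev_aio step: the malloc bdev size and block size are assumptions, since the trace only shows the bare rpc.py call, while the dd and bdev_aio_create arguments are taken verbatim from the log.

    for i in 0 1 2; do
        "$rpc_py" bdev_malloc_create -b "Malloc$i" 32 512   # 32 MiB, 512 B blocks (assumed)
    done

    aiofile=./test/interrupt/aiofile
    dd if=/dev/zero of="$aiofile" bs=2048 count=5000        # 10 MB backing file
    "$rpc_py" bdev_aio_create "$aiofile" AIO0 2048          # expose it as bdev AIO0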
00:26:00.374 10:21:22 reactor_set_interrupt -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:00.374 10:21:22 reactor_set_interrupt -- common/autotest_common.sh@863 -- # return 0 00:26:00.374 10:21:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:26:00.374 10:21:22 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:00.634 Malloc0 00:26:00.634 Malloc1 00:26:00.634 Malloc2 00:26:00.634 10:21:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:26:00.634 10:21:22 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:26:00.634 10:21:22 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:26:00.634 10:21:22 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:26:00.634 5000+0 records in 00:26:00.634 5000+0 records out 00:26:00.634 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0167976 s, 610 MB/s 00:26:00.634 10:21:22 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:26:00.893 AIO0 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1135339 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1135339 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1135339 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:00.893 10:21:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:01.153 10:21:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:26:01.153 10:21:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:26:01.153 10:21:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:26:01.153 10:21:22 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:26:01.153 10:21:22 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:01.153 10:21:22 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:26:01.153 10:21:22 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:26:01.153 10:21:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:01.153 10:21:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:26:01.153 spdk_thread ids are 1 on reactor0. 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1135339 0 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1135339 0 idle 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1135339 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1135339 -w 256 00:26:01.153 10:21:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1135339 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.28 reactor_0' 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1135339 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.28 reactor_0 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1135339 1 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1135339 1 idle 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1135339 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:26:01.413 10:21:23 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:26:01.413 10:21:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1135339 -w 256 00:26:01.679 10:21:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1135346 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_1' 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1135346 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_1 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1135339 2 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1135339 2 idle 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1135339 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1135339 -w 256 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1135347 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_2' 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1135347 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_2 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:01.680 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:01.983 10:21:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
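
Note: the thd0_ids / thd2_ids lookups a little earlier in this trace resolve which SPDK thread ids run on a given reactor by asking the target for thread statistics over its RPC socket and filtering on the thread cpumask with jq. A minimal sketch of that query, using the rpc.py path and jq filter shown in this log; the wrapper name get_thread_ids_for_mask and the 0x-prefix stripping are illustrative.

    # Sketch: list SPDK thread ids whose cpumask matches a reactor's mask.
    # rpc.py talks to the default /var/tmp/spdk.sock socket the target listens on.
    get_thread_ids_for_mask() {
        local mask=$1                        # e.g. 0x1 for reactor 0, 0x4 for reactor 2
        local rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
        # thread_get_stats reports each thread's id and cpumask; the mask is matched
        # without its 0x prefix (the trace compares against "1" and "4").
        "$rpc" thread_get_stats | \
            jq --arg reactor_cpumask "${mask#0x}" \
               '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
    }

At this point in the run only the app_thread exists, pinned to core 0, so the reactor-2 query comes back empty, which is why the trace echoes an empty string before printing "spdk_thread ids are 1 on reactor0."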
00:26:01.983 10:21:23 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:01.983 10:21:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:01.983 10:21:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:01.983 10:21:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:01.983 10:21:23 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:01.983 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:26:01.983 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:26:01.983 [2024-06-10 10:21:23.731589] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:26:01.983 [2024-06-10 10:21:23.731803] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:26:01.984 [2024-06-10 10:21:23.732089] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:01.984 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:26:02.274 [2024-06-10 10:21:23.931922] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:26:02.274 [2024-06-10 10:21:23.932256] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1135339 0 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1135339 0 busy 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1135339 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1135339 -w 256 00:26:02.274 10:21:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1135339 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.68 reactor_0' 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1135339 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.68 reactor_0 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:02.274 10:21:24 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1135339 2 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1135339 2 busy 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1135339 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1135339 -w 256 00:26:02.274 10:21:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:02.535 10:21:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1135347 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.35 reactor_2' 00:26:02.535 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1135347 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.35 reactor_2 00:26:02.535 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:02.535 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:02.535 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:02.535 10:21:24 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:02.535 10:21:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:02.535 10:21:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:02.535 10:21:24 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:02.535 10:21:24 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:02.536 10:21:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:26:02.797 [2024-06-10 10:21:24.481357] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:26:02.797 [2024-06-10 10:21:24.481501] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1135339 2 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1135339 2 idle 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1135339 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1135339 -w 256 00:26:02.797 10:21:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1135347 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.54 reactor_2' 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1135347 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.54 reactor_2 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:26:03.057 [2024-06-10 10:21:24.862302] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:26:03.057 [2024-06-10 10:21:24.862623] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
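
Note: the mode switches in this part of the trace are driven entirely over RPC. Calling reactor_set_interrupt_mode with -d disables interrupt mode (the reactor returns to poll mode and its thread goes to ~100% CPU, which is what the busy checks above confirm), and the same call without -d re-enables interrupt mode, after which the reactor is expected to go idle again. A hedged sketch of that toggle, reusing exactly the command form from this trace; the wrapper set_reactor_mode and the example sequence are illustrative.

    # Sketch: flip one reactor between poll and interrupt mode.
    # rpc.py, the interrupt_plugin and reactor_set_interrupt_mode are the ones
    # invoked in the trace above.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    set_reactor_mode() {
        local reactor=$1 mode=$2             # mode: "poll" or "intr"
        if [[ $mode == poll ]]; then
            "$rpc" --plugin interrupt_plugin reactor_set_interrupt_mode "$reactor" -d
        else
            "$rpc" --plugin interrupt_plugin reactor_set_interrupt_mode "$reactor"
        fi
    }
    # Mirroring the sequence in the trace:
    #   set_reactor_mode 0 poll; set_reactor_mode 2 poll   # both reactors go busy
    #   set_reactor_mode 2 intr; set_reactor_mode 0 intr   # both return to idle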
00:26:03.057 [2024-06-10 10:21:24.862647] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1135339 0 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1135339 0 idle 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1135339 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1135339 -w 256 00:26:03.057 10:21:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1135339 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:01.42 reactor_0' 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1135339 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:01.42 reactor_0 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:26:03.317 10:21:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1135339 00:26:03.317 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@949 -- # '[' -z 1135339 ']' 00:26:03.318 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@953 -- # kill -0 1135339 00:26:03.318 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@954 -- # uname 00:26:03.318 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:03.318 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1135339 00:26:03.318 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:03.318 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@959 -- # '[' reactor_0 = 
sudo ']' 00:26:03.318 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1135339' 00:26:03.318 killing process with pid 1135339 00:26:03.318 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@968 -- # kill 1135339 00:26:03.318 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@973 -- # wait 1135339 00:26:03.577 10:21:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:26:03.577 10:21:25 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:03.577 00:26:03.577 real 0m8.592s 00:26:03.577 user 0m7.984s 00:26:03.577 sys 0m1.559s 00:26:03.577 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:03.577 10:21:25 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:26:03.577 ************************************ 00:26:03.577 END TEST reactor_set_interrupt 00:26:03.577 ************************************ 00:26:03.578 10:21:25 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:03.578 10:21:25 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:26:03.578 10:21:25 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:03.578 10:21:25 -- common/autotest_common.sh@10 -- # set +x 00:26:03.578 ************************************ 00:26:03.578 START TEST reap_unregistered_poller 00:26:03.578 ************************************ 00:26:03.578 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:03.578 * Looking for test storage... 00:26:03.841 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:03.841 10:21:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:26:03.841 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:03.841 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:03.841 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:03.841 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
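
Note: killprocess, used twice in this log (most recently just above, before reap_unregistered_poller starts), guards the kill with the sanity checks visible in the trace: the pid must still be alive (kill -0), and on Linux the process name is read back with ps so a recycled pid belonging to sudo is not signalled blindly; the target is then killed and waited on. A condensed sketch of that teardown; the real autotest_common.sh handling of the sudo case and of extra kill options is richer than shown here, and the refusal on sudo below is an assumption made to keep the sketch simple.

    # Sketch: safe teardown of a test target by pid, following the steps in the trace.
    kill_test_process() {
        local pid=$1
        [[ -n $pid ]] || return 1
        kill -0 "$pid" 2>/dev/null || return 0       # already gone, nothing to do
        if [[ $(uname) == Linux ]]; then
            local name
            name=$(ps --no-headers -o comm= "$pid")  # e.g. "reactor_0" in the trace
            [[ $name == sudo ]] && return 1          # assumption: bail out on a sudo-owned pid
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" 2>/dev/null                      # reap it so the RPC socket is freed
        return 0
    }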
00:26:03.841 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:03.841 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:26:03.841 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:26:03.841 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:26:03.841 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:26:03.841 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:26:03.841 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:26:03.841 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:26:03.841 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:26:03.841 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:26:03.841 10:21:25 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:26:03.841 10:21:25 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:26:03.841 10:21:25 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:26:03.841 10:21:25 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:26:03.841 10:21:25 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@57 -- # 
CONFIG_HAVE_LIBBSD=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:26:03.842 10:21:25 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:26:03.842 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@10 -- # 
_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:26:03.842 10:21:25 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:26:03.842 #define SPDK_CONFIG_H 00:26:03.842 #define SPDK_CONFIG_APPS 1 00:26:03.842 #define SPDK_CONFIG_ARCH native 00:26:03.842 #undef SPDK_CONFIG_ASAN 00:26:03.842 #undef SPDK_CONFIG_AVAHI 00:26:03.842 #undef SPDK_CONFIG_CET 00:26:03.842 #define SPDK_CONFIG_COVERAGE 1 00:26:03.842 #define SPDK_CONFIG_CROSS_PREFIX 00:26:03.842 #define SPDK_CONFIG_CRYPTO 1 00:26:03.842 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:26:03.842 #undef SPDK_CONFIG_CUSTOMOCF 00:26:03.842 #undef SPDK_CONFIG_DAOS 00:26:03.842 #define SPDK_CONFIG_DAOS_DIR 00:26:03.842 #define SPDK_CONFIG_DEBUG 1 00:26:03.842 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:26:03.842 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:03.842 #define SPDK_CONFIG_DPDK_INC_DIR 00:26:03.842 #define SPDK_CONFIG_DPDK_LIB_DIR 00:26:03.842 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:26:03.842 #undef SPDK_CONFIG_DPDK_UADK 00:26:03.842 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:03.842 #define SPDK_CONFIG_EXAMPLES 1 00:26:03.842 #undef SPDK_CONFIG_FC 00:26:03.842 #define SPDK_CONFIG_FC_PATH 00:26:03.842 #define SPDK_CONFIG_FIO_PLUGIN 1 00:26:03.842 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:26:03.842 #undef SPDK_CONFIG_FUSE 00:26:03.842 #undef SPDK_CONFIG_FUZZER 00:26:03.842 #define SPDK_CONFIG_FUZZER_LIB 00:26:03.842 #undef SPDK_CONFIG_GOLANG 00:26:03.842 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:26:03.842 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:26:03.842 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:26:03.842 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:26:03.842 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:26:03.842 #undef SPDK_CONFIG_HAVE_LIBBSD 00:26:03.842 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:26:03.842 #define SPDK_CONFIG_IDXD 1 00:26:03.842 #define SPDK_CONFIG_IDXD_KERNEL 1 00:26:03.842 #define SPDK_CONFIG_IPSEC_MB 1 00:26:03.842 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:03.843 #define SPDK_CONFIG_ISAL 1 00:26:03.843 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:26:03.843 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:26:03.843 #define SPDK_CONFIG_LIBDIR 00:26:03.843 #undef SPDK_CONFIG_LTO 
00:26:03.843 #define SPDK_CONFIG_MAX_LCORES 00:26:03.843 #define SPDK_CONFIG_NVME_CUSE 1 00:26:03.843 #undef SPDK_CONFIG_OCF 00:26:03.843 #define SPDK_CONFIG_OCF_PATH 00:26:03.843 #define SPDK_CONFIG_OPENSSL_PATH 00:26:03.843 #undef SPDK_CONFIG_PGO_CAPTURE 00:26:03.843 #define SPDK_CONFIG_PGO_DIR 00:26:03.843 #undef SPDK_CONFIG_PGO_USE 00:26:03.843 #define SPDK_CONFIG_PREFIX /usr/local 00:26:03.843 #undef SPDK_CONFIG_RAID5F 00:26:03.843 #undef SPDK_CONFIG_RBD 00:26:03.843 #define SPDK_CONFIG_RDMA 1 00:26:03.843 #define SPDK_CONFIG_RDMA_PROV verbs 00:26:03.843 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:26:03.843 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:26:03.843 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:26:03.843 #define SPDK_CONFIG_SHARED 1 00:26:03.843 #undef SPDK_CONFIG_SMA 00:26:03.843 #define SPDK_CONFIG_TESTS 1 00:26:03.843 #undef SPDK_CONFIG_TSAN 00:26:03.843 #define SPDK_CONFIG_UBLK 1 00:26:03.843 #define SPDK_CONFIG_UBSAN 1 00:26:03.843 #undef SPDK_CONFIG_UNIT_TESTS 00:26:03.843 #undef SPDK_CONFIG_URING 00:26:03.843 #define SPDK_CONFIG_URING_PATH 00:26:03.843 #undef SPDK_CONFIG_URING_ZNS 00:26:03.843 #undef SPDK_CONFIG_USDT 00:26:03.843 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:26:03.843 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:26:03.843 #undef SPDK_CONFIG_VFIO_USER 00:26:03.843 #define SPDK_CONFIG_VFIO_USER_DIR 00:26:03.843 #define SPDK_CONFIG_VHOST 1 00:26:03.843 #define SPDK_CONFIG_VIRTIO 1 00:26:03.843 #undef SPDK_CONFIG_VTUNE 00:26:03.843 #define SPDK_CONFIG_VTUNE_DIR 00:26:03.843 #define SPDK_CONFIG_WERROR 1 00:26:03.843 #define SPDK_CONFIG_WPDK_DIR 00:26:03.843 #undef SPDK_CONFIG_XNVME 00:26:03.843 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:26:03.843 10:21:25 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:03.843 10:21:25 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:03.843 10:21:25 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:03.843 10:21:25 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:03.843 10:21:25 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:03.843 10:21:25 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:03.843 10:21:25 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:03.843 10:21:25 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:26:03.843 10:21:25 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:26:03.843 10:21:25 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:26:03.843 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:26:03.844 10:21:25 reap_unregistered_poller -- 
common/autotest_common.sh@126 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:26:03.844 10:21:25 
reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
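The exports traced above show autotest_common.sh layering the SPDK, DPDK and libvfio-user build directories onto LD_LIBRARY_PATH (the duplicated entries appear because the file is sourced more than once in the same environment). A minimal bash sketch of that composition, reusing only the directory paths visible in the trace, might look like:

    # Compose the runtime linker path from the build output directories,
    # mirroring the exports performed by autotest_common.sh in the trace above.
    SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib
    DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib
    VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib
    export SPDK_LIB_DIR DPDK_LIB_DIR VFIO_LIB_DIR
    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$SPDK_LIB_DIR:$DPDK_LIB_DIR:$VFIO_LIB_DIR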
00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:03.844 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j128 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 1136102 ]] 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 1136102 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@1679 -- # set_test_storage 2147483648 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ 
-v testdir ]] 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.Ua1sfn 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.Ua1sfn/tests/interrupt /tmp/spdk.Ua1sfn 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=957218816 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4327211008 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=118524956672 00:26:03.845 
10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=129376284672 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=10851328000 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=64683429888 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688140288 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=25865334784 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=25875259392 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9924608 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=efivarfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=efivarfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=339968 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=507904 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=163840 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=64687251456 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688144384 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=892928 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=12937621504 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=12937625600 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4096 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:26:03.845 * Looking for test storage... 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=118524956672 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=13065920512 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:03.845 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:26:03.845 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@1681 -- # set -o errtrace 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # shopt -s extdebug 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@1685 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # true 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@1688 -- # xtrace_fd 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:26:03.846 
10:21:25 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1136240 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1136240 /var/tmp/spdk.sock 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@830 -- # '[' -z 1136240 ']' 00:26:03.846 10:21:25 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@835 -- # local 
max_retries=100 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:03.846 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:03.846 10:21:25 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:03.846 [2024-06-10 10:21:25.673632] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:26:03.846 [2024-06-10 10:21:25.673703] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1136240 ] 00:26:04.106 [2024-06-10 10:21:25.766374] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:04.106 [2024-06-10 10:21:25.861134] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:26:04.106 [2024-06-10 10:21:25.861263] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:26:04.106 [2024-06-10 10:21:25.861266] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:04.106 [2024-06-10 10:21:25.932621] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:26:04.677 10:21:26 reap_unregistered_poller -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:04.677 10:21:26 reap_unregistered_poller -- common/autotest_common.sh@863 -- # return 0 00:26:04.938 10:21:26 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:26:04.938 10:21:26 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:26:04.938 10:21:26 reap_unregistered_poller -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:04.939 10:21:26 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:04.939 10:21:26 reap_unregistered_poller -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:04.939 10:21:26 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:26:04.939 "name": "app_thread", 00:26:04.939 "id": 1, 00:26:04.939 "active_pollers": [], 00:26:04.939 "timed_pollers": [ 00:26:04.939 { 00:26:04.939 "name": "rpc_subsystem_poll_servers", 00:26:04.939 "id": 1, 00:26:04.939 "state": "waiting", 00:26:04.939 "run_count": 0, 00:26:04.939 "busy_count": 0, 00:26:04.939 "period_ticks": 10400000 00:26:04.939 } 00:26:04.939 ], 00:26:04.939 "paused_pollers": [] 00:26:04.939 }' 00:26:04.939 10:21:26 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:26:04.939 10:21:26 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:26:04.939 10:21:26 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:26:04.939 10:21:26 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:26:04.939 10:21:26 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:26:04.939 10:21:26 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:26:04.939 10:21:26 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 
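The trace above shows reap_unregistered_poller.sh pulling the app_thread state through the thread_get_pollers RPC and splitting it with jq into active and timed poller names. A minimal sketch of that query pattern, assuming the same rpc.py script and the default /var/tmp/spdk.sock socket seen in the trace, could be:

    # Fetch the first SPDK thread and list its pollers, using the same jq
    # filters that appear in the trace (.active_pollers[].name, .timed_pollers[].name).
    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    rpc_sock=/var/tmp/spdk.sock
    app_thread=$("$rpc_py" -s "$rpc_sock" thread_get_pollers | jq -r '.threads[0]')
    native_pollers=$(echo "$app_thread" | jq -r '.active_pollers[].name')
    native_pollers+=' '
    native_pollers+=$(echo "$app_thread" | jq -r '.timed_pollers[].name')
    echo "pollers on app_thread:$native_pollers"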
00:26:04.939 10:21:26 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:26:04.939 10:21:26 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:26:04.939 5000+0 records in 00:26:04.939 5000+0 records out 00:26:04.939 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0169956 s, 603 MB/s 00:26:04.939 10:21:26 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:26:05.199 AIO0 00:26:05.199 10:21:26 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:05.460 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:26:05.460 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:26:05.460 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:26:05.460 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:05.460 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:05.460 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:05.460 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:26:05.460 "name": "app_thread", 00:26:05.460 "id": 1, 00:26:05.460 "active_pollers": [], 00:26:05.460 "timed_pollers": [ 00:26:05.460 { 00:26:05.460 "name": "rpc_subsystem_poll_servers", 00:26:05.460 "id": 1, 00:26:05.460 "state": "waiting", 00:26:05.460 "run_count": 0, 00:26:05.460 "busy_count": 0, 00:26:05.460 "period_ticks": 10400000 00:26:05.460 } 00:26:05.460 ], 00:26:05.460 "paused_pollers": [] 00:26:05.460 }' 00:26:05.460 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:26:05.721 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:26:05.721 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:26:05.721 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:26:05.721 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:26:05.721 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:26:05.721 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:26:05.721 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1136240 00:26:05.721 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@949 -- # '[' -z 1136240 ']' 00:26:05.721 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@953 -- # kill -0 1136240 00:26:05.721 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@954 -- # uname 00:26:05.721 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:05.721 10:21:27 reap_unregistered_poller -- 
common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1136240 00:26:05.721 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:05.721 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:05.721 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1136240' 00:26:05.721 killing process with pid 1136240 00:26:05.721 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@968 -- # kill 1136240 00:26:05.721 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@973 -- # wait 1136240 00:26:05.721 10:21:27 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:26:05.721 10:21:27 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:05.721 00:26:05.721 real 0m2.234s 00:26:05.721 user 0m1.356s 00:26:05.721 sys 0m0.587s 00:26:05.721 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:05.721 10:21:27 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:05.721 ************************************ 00:26:05.721 END TEST reap_unregistered_poller 00:26:05.721 ************************************ 00:26:05.983 10:21:27 -- spdk/autotest.sh@198 -- # uname -s 00:26:05.983 10:21:27 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:26:05.983 10:21:27 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:26:05.983 10:21:27 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:26:05.983 10:21:27 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@260 -- # timing_exit lib 00:26:05.983 10:21:27 -- common/autotest_common.sh@729 -- # xtrace_disable 00:26:05.983 10:21:27 -- common/autotest_common.sh@10 -- # set +x 00:26:05.983 10:21:27 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:26:05.983 10:21:27 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:26:05.983 10:21:27 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:26:05.983 10:21:27 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:05.983 10:21:27 -- common/autotest_common.sh@10 -- # set +x 00:26:05.983 ************************************ 00:26:05.983 START TEST compress_compdev 00:26:05.983 ************************************ 00:26:05.983 10:21:27 compress_compdev -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:26:05.983 * Looking for test storage... 
00:26:05.983 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:26:05.983 10:21:27 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:05.983 10:21:27 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:05.983 10:21:27 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:05.983 10:21:27 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:05.983 10:21:27 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:05.983 10:21:27 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:05.983 10:21:27 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:05.983 10:21:27 compress_compdev -- paths/export.sh@5 -- # export PATH 00:26:05.983 10:21:27 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:26:05.983 10:21:27 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:26:05.983 10:21:27 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:05.983 10:21:27 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:26:05.983 10:21:27 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:26:05.983 10:21:27 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:26:05.983 10:21:27 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:05.983 10:21:27 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1136791 00:26:05.983 10:21:27 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:05.983 10:21:27 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1136791 00:26:05.983 10:21:27 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 1136791 ']' 00:26:05.983 10:21:27 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:26:05.983 10:21:27 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:05.983 10:21:27 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:05.983 10:21:27 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:05.983 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
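At this point compress.sh has launched bdevperf with -z (start it suspended until the RPC framework is up) and waitforlisten is blocking until the UNIX domain socket accepts requests. A minimal sketch of that wait loop, assuming the socket path from the trace and a hypothetical 100-iteration retry cap (the real helper lives in autotest_common.sh), might be:

    # Poll the bdevperf RPC socket until it answers, then continue the test.
    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    rpc_sock=/var/tmp/spdk.sock
    max_retries=100
    for ((i = 0; i < max_retries; i++)); do
        if "$rpc_py" -s "$rpc_sock" rpc_get_methods &> /dev/null; then
            echo "bdevperf is listening on $rpc_sock"
            break
        fi
        sleep 0.5
    done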
00:26:05.983 10:21:27 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:05.983 10:21:27 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:06.245 [2024-06-10 10:21:27.883453] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:26:06.245 [2024-06-10 10:21:27.883518] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1136791 ] 00:26:06.245 [2024-06-10 10:21:27.957307] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:06.245 [2024-06-10 10:21:28.029468] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:26:06.245 [2024-06-10 10:21:28.029474] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:26:06.815 [2024-06-10 10:21:28.422369] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:07.075 10:21:28 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:07.075 10:21:28 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:26:07.075 10:21:28 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:26:07.075 10:21:28 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:07.075 10:21:28 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:10.425 [2024-06-10 10:21:31.755986] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2257570 PMD being used: compress_qat 00:26:10.425 10:21:31 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:10.425 10:21:31 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:26:10.425 10:21:31 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:10.425 10:21:31 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:10.425 10:21:31 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:10.425 10:21:31 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:10.425 10:21:31 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:10.425 10:21:31 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:10.425 [ 00:26:10.425 { 00:26:10.425 "name": "Nvme0n1", 00:26:10.425 "aliases": [ 00:26:10.425 "f6a94a88-9718-482f-b456-a881e5c5a605" 00:26:10.425 ], 00:26:10.425 "product_name": "NVMe disk", 00:26:10.425 "block_size": 512, 00:26:10.425 "num_blocks": 3907029168, 00:26:10.425 "uuid": "f6a94a88-9718-482f-b456-a881e5c5a605", 00:26:10.425 "assigned_rate_limits": { 00:26:10.425 "rw_ios_per_sec": 0, 00:26:10.425 "rw_mbytes_per_sec": 0, 00:26:10.425 "r_mbytes_per_sec": 0, 00:26:10.425 "w_mbytes_per_sec": 0 00:26:10.425 }, 00:26:10.425 "claimed": false, 00:26:10.425 "zoned": false, 00:26:10.425 "supported_io_types": { 00:26:10.425 "read": true, 00:26:10.425 "write": true, 00:26:10.425 "unmap": true, 00:26:10.425 "write_zeroes": true, 00:26:10.425 "flush": true, 00:26:10.425 "reset": true, 00:26:10.425 "compare": false, 00:26:10.425 "compare_and_write": false, 00:26:10.425 "abort": true, 00:26:10.425 "nvme_admin": true, 00:26:10.425 
"nvme_io": true 00:26:10.425 }, 00:26:10.425 "driver_specific": { 00:26:10.425 "nvme": [ 00:26:10.425 { 00:26:10.425 "pci_address": "0000:65:00.0", 00:26:10.425 "trid": { 00:26:10.425 "trtype": "PCIe", 00:26:10.425 "traddr": "0000:65:00.0" 00:26:10.425 }, 00:26:10.425 "ctrlr_data": { 00:26:10.425 "cntlid": 0, 00:26:10.425 "vendor_id": "0x8086", 00:26:10.425 "model_number": "INTEL SSDPE2KX020T8", 00:26:10.425 "serial_number": "PHLJ9512038S2P0BGN", 00:26:10.425 "firmware_revision": "VDV10184", 00:26:10.425 "oacs": { 00:26:10.425 "security": 0, 00:26:10.425 "format": 1, 00:26:10.425 "firmware": 1, 00:26:10.425 "ns_manage": 1 00:26:10.425 }, 00:26:10.425 "multi_ctrlr": false, 00:26:10.425 "ana_reporting": false 00:26:10.425 }, 00:26:10.425 "vs": { 00:26:10.425 "nvme_version": "1.2" 00:26:10.425 }, 00:26:10.425 "ns_data": { 00:26:10.425 "id": 1, 00:26:10.425 "can_share": false 00:26:10.425 } 00:26:10.425 } 00:26:10.425 ], 00:26:10.425 "mp_policy": "active_passive" 00:26:10.425 } 00:26:10.425 } 00:26:10.425 ] 00:26:10.425 10:21:32 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:10.425 10:21:32 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:10.685 [2024-06-10 10:21:32.344156] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22584a0 PMD being used: compress_qat 00:26:11.625 21c025d9-b588-4686-874a-5e180161c630 00:26:11.625 10:21:33 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:11.886 019bb72f-7b83-4bb8-a315-e0fc20da2b40 00:26:11.886 10:21:33 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:11.886 10:21:33 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:26:11.886 10:21:33 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:11.886 10:21:33 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:11.886 10:21:33 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:11.886 10:21:33 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:11.886 10:21:33 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:12.147 10:21:33 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:12.147 [ 00:26:12.147 { 00:26:12.147 "name": "019bb72f-7b83-4bb8-a315-e0fc20da2b40", 00:26:12.147 "aliases": [ 00:26:12.147 "lvs0/lv0" 00:26:12.147 ], 00:26:12.147 "product_name": "Logical Volume", 00:26:12.147 "block_size": 512, 00:26:12.147 "num_blocks": 204800, 00:26:12.147 "uuid": "019bb72f-7b83-4bb8-a315-e0fc20da2b40", 00:26:12.147 "assigned_rate_limits": { 00:26:12.147 "rw_ios_per_sec": 0, 00:26:12.147 "rw_mbytes_per_sec": 0, 00:26:12.147 "r_mbytes_per_sec": 0, 00:26:12.147 "w_mbytes_per_sec": 0 00:26:12.147 }, 00:26:12.147 "claimed": false, 00:26:12.147 "zoned": false, 00:26:12.147 "supported_io_types": { 00:26:12.147 "read": true, 00:26:12.147 "write": true, 00:26:12.147 "unmap": true, 00:26:12.147 "write_zeroes": true, 00:26:12.147 "flush": false, 00:26:12.147 "reset": true, 00:26:12.147 "compare": false, 00:26:12.147 "compare_and_write": false, 00:26:12.147 "abort": false, 00:26:12.147 "nvme_admin": false, 00:26:12.147 
"nvme_io": false 00:26:12.147 }, 00:26:12.147 "driver_specific": { 00:26:12.147 "lvol": { 00:26:12.147 "lvol_store_uuid": "21c025d9-b588-4686-874a-5e180161c630", 00:26:12.147 "base_bdev": "Nvme0n1", 00:26:12.147 "thin_provision": true, 00:26:12.147 "num_allocated_clusters": 0, 00:26:12.147 "snapshot": false, 00:26:12.147 "clone": false, 00:26:12.147 "esnap_clone": false 00:26:12.147 } 00:26:12.147 } 00:26:12.147 } 00:26:12.147 ] 00:26:12.147 10:21:33 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:12.147 10:21:33 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:26:12.147 10:21:33 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:26:12.410 [2024-06-10 10:21:34.176119] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:12.410 COMP_lvs0/lv0 00:26:12.410 10:21:34 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:12.410 10:21:34 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:26:12.410 10:21:34 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:12.410 10:21:34 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:12.410 10:21:34 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:12.410 10:21:34 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:12.410 10:21:34 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:12.671 10:21:34 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:12.931 [ 00:26:12.931 { 00:26:12.931 "name": "COMP_lvs0/lv0", 00:26:12.931 "aliases": [ 00:26:12.931 "193159c6-263d-57a4-95af-c67f35f9d7bf" 00:26:12.931 ], 00:26:12.931 "product_name": "compress", 00:26:12.931 "block_size": 512, 00:26:12.931 "num_blocks": 200704, 00:26:12.931 "uuid": "193159c6-263d-57a4-95af-c67f35f9d7bf", 00:26:12.931 "assigned_rate_limits": { 00:26:12.931 "rw_ios_per_sec": 0, 00:26:12.931 "rw_mbytes_per_sec": 0, 00:26:12.931 "r_mbytes_per_sec": 0, 00:26:12.931 "w_mbytes_per_sec": 0 00:26:12.931 }, 00:26:12.931 "claimed": false, 00:26:12.931 "zoned": false, 00:26:12.931 "supported_io_types": { 00:26:12.931 "read": true, 00:26:12.931 "write": true, 00:26:12.931 "unmap": false, 00:26:12.931 "write_zeroes": true, 00:26:12.931 "flush": false, 00:26:12.931 "reset": false, 00:26:12.931 "compare": false, 00:26:12.931 "compare_and_write": false, 00:26:12.931 "abort": false, 00:26:12.931 "nvme_admin": false, 00:26:12.931 "nvme_io": false 00:26:12.931 }, 00:26:12.931 "driver_specific": { 00:26:12.931 "compress": { 00:26:12.931 "name": "COMP_lvs0/lv0", 00:26:12.931 "base_bdev_name": "019bb72f-7b83-4bb8-a315-e0fc20da2b40" 00:26:12.931 } 00:26:12.931 } 00:26:12.931 } 00:26:12.931 ] 00:26:12.932 10:21:34 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:12.932 10:21:34 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:12.932 [2024-06-10 10:21:34.649655] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f24401b15c0 PMD being used: compress_qat 00:26:12.932 [2024-06-10 10:21:34.651297] accel_dpdk_compressdev.c: 
690:_set_pmd: *NOTICE*: Channel 0x208d880 PMD being used: compress_qat 00:26:12.932 Running I/O for 3 seconds... 00:26:16.231 00:26:16.231 Latency(us) 00:26:16.231 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:16.231 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:16.231 Verification LBA range: start 0x0 length 0x3100 00:26:16.231 COMP_lvs0/lv0 : 3.01 4350.02 16.99 0.00 0.00 7306.85 123.67 13006.38 00:26:16.231 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:16.231 Verification LBA range: start 0x3100 length 0x3100 00:26:16.231 COMP_lvs0/lv0 : 3.01 4431.06 17.31 0.00 0.00 7186.19 117.37 12855.14 00:26:16.231 =================================================================================================================== 00:26:16.231 Total : 8781.08 34.30 0.00 0.00 7245.98 117.37 13006.38 00:26:16.231 0 00:26:16.231 10:21:37 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:26:16.231 10:21:37 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:16.231 10:21:37 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:16.492 10:21:38 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:16.492 10:21:38 compress_compdev -- compress/compress.sh@78 -- # killprocess 1136791 00:26:16.492 10:21:38 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 1136791 ']' 00:26:16.492 10:21:38 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 1136791 00:26:16.492 10:21:38 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:26:16.492 10:21:38 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:16.492 10:21:38 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1136791 00:26:16.492 10:21:38 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:26:16.492 10:21:38 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:26:16.492 10:21:38 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1136791' 00:26:16.492 killing process with pid 1136791 00:26:16.492 10:21:38 compress_compdev -- common/autotest_common.sh@968 -- # kill 1136791 00:26:16.492 Received shutdown signal, test time was about 3.000000 seconds 00:26:16.492 00:26:16.492 Latency(us) 00:26:16.492 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:16.492 =================================================================================================================== 00:26:16.492 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:16.492 10:21:38 compress_compdev -- common/autotest_common.sh@973 -- # wait 1136791 00:26:19.038 10:21:40 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:26:19.038 10:21:40 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:19.038 10:21:40 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1138855 00:26:19.038 10:21:40 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:19.038 10:21:40 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1138855 00:26:19.038 10:21:40 compress_compdev -- compress/compress.sh@67 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:26:19.038 10:21:40 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 1138855 ']' 00:26:19.038 10:21:40 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:19.038 10:21:40 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:19.038 10:21:40 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:19.038 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:19.038 10:21:40 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:19.038 10:21:40 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:19.038 [2024-06-10 10:21:40.625567] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:26:19.038 [2024-06-10 10:21:40.625620] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1138855 ] 00:26:19.038 [2024-06-10 10:21:40.693957] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:19.038 [2024-06-10 10:21:40.758029] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:26:19.038 [2024-06-10 10:21:40.758035] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:26:19.608 [2024-06-10 10:21:41.167466] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:19.608 10:21:41 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:19.608 10:21:41 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:26:19.608 10:21:41 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:26:19.608 10:21:41 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:19.608 10:21:41 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:22.903 [2024-06-10 10:21:44.476849] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x27db570 PMD being used: compress_qat 00:26:22.903 10:21:44 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:22.903 10:21:44 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:26:22.903 10:21:44 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:22.903 10:21:44 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:22.903 10:21:44 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:22.903 10:21:44 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:22.903 10:21:44 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:22.903 10:21:44 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:23.163 [ 00:26:23.163 { 00:26:23.163 "name": "Nvme0n1", 00:26:23.163 "aliases": [ 00:26:23.163 "17fd591e-44cb-4e23-83fb-7cd04a651e07" 00:26:23.163 ], 00:26:23.163 
"product_name": "NVMe disk", 00:26:23.163 "block_size": 512, 00:26:23.163 "num_blocks": 3907029168, 00:26:23.163 "uuid": "17fd591e-44cb-4e23-83fb-7cd04a651e07", 00:26:23.163 "assigned_rate_limits": { 00:26:23.163 "rw_ios_per_sec": 0, 00:26:23.163 "rw_mbytes_per_sec": 0, 00:26:23.164 "r_mbytes_per_sec": 0, 00:26:23.164 "w_mbytes_per_sec": 0 00:26:23.164 }, 00:26:23.164 "claimed": false, 00:26:23.164 "zoned": false, 00:26:23.164 "supported_io_types": { 00:26:23.164 "read": true, 00:26:23.164 "write": true, 00:26:23.164 "unmap": true, 00:26:23.164 "write_zeroes": true, 00:26:23.164 "flush": true, 00:26:23.164 "reset": true, 00:26:23.164 "compare": false, 00:26:23.164 "compare_and_write": false, 00:26:23.164 "abort": true, 00:26:23.164 "nvme_admin": true, 00:26:23.164 "nvme_io": true 00:26:23.164 }, 00:26:23.164 "driver_specific": { 00:26:23.164 "nvme": [ 00:26:23.164 { 00:26:23.164 "pci_address": "0000:65:00.0", 00:26:23.164 "trid": { 00:26:23.164 "trtype": "PCIe", 00:26:23.164 "traddr": "0000:65:00.0" 00:26:23.164 }, 00:26:23.164 "ctrlr_data": { 00:26:23.164 "cntlid": 0, 00:26:23.164 "vendor_id": "0x8086", 00:26:23.164 "model_number": "INTEL SSDPE2KX020T8", 00:26:23.164 "serial_number": "PHLJ9512038S2P0BGN", 00:26:23.164 "firmware_revision": "VDV10184", 00:26:23.164 "oacs": { 00:26:23.164 "security": 0, 00:26:23.164 "format": 1, 00:26:23.164 "firmware": 1, 00:26:23.164 "ns_manage": 1 00:26:23.164 }, 00:26:23.164 "multi_ctrlr": false, 00:26:23.164 "ana_reporting": false 00:26:23.164 }, 00:26:23.164 "vs": { 00:26:23.164 "nvme_version": "1.2" 00:26:23.164 }, 00:26:23.164 "ns_data": { 00:26:23.164 "id": 1, 00:26:23.164 "can_share": false 00:26:23.164 } 00:26:23.164 } 00:26:23.164 ], 00:26:23.164 "mp_policy": "active_passive" 00:26:23.164 } 00:26:23.164 } 00:26:23.164 ] 00:26:23.164 10:21:44 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:23.164 10:21:44 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:23.424 [2024-06-10 10:21:45.096628] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x27dc4a0 PMD being used: compress_qat 00:26:24.365 93e601b9-0135-4ef7-9a58-ed4bef73fa9e 00:26:24.365 10:21:46 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:24.624 f39b4f6e-e4a1-4f67-9bd4-4d346cebd2fb 00:26:24.624 10:21:46 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:24.624 10:21:46 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:26:24.624 10:21:46 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:24.624 10:21:46 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:24.624 10:21:46 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:24.624 10:21:46 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:24.624 10:21:46 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:24.884 10:21:46 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:25.144 [ 00:26:25.144 { 00:26:25.144 "name": "f39b4f6e-e4a1-4f67-9bd4-4d346cebd2fb", 00:26:25.144 "aliases": [ 00:26:25.144 "lvs0/lv0" 00:26:25.144 ], 00:26:25.144 
"product_name": "Logical Volume", 00:26:25.144 "block_size": 512, 00:26:25.144 "num_blocks": 204800, 00:26:25.144 "uuid": "f39b4f6e-e4a1-4f67-9bd4-4d346cebd2fb", 00:26:25.144 "assigned_rate_limits": { 00:26:25.144 "rw_ios_per_sec": 0, 00:26:25.144 "rw_mbytes_per_sec": 0, 00:26:25.144 "r_mbytes_per_sec": 0, 00:26:25.144 "w_mbytes_per_sec": 0 00:26:25.144 }, 00:26:25.144 "claimed": false, 00:26:25.144 "zoned": false, 00:26:25.144 "supported_io_types": { 00:26:25.144 "read": true, 00:26:25.144 "write": true, 00:26:25.144 "unmap": true, 00:26:25.144 "write_zeroes": true, 00:26:25.144 "flush": false, 00:26:25.144 "reset": true, 00:26:25.144 "compare": false, 00:26:25.144 "compare_and_write": false, 00:26:25.144 "abort": false, 00:26:25.144 "nvme_admin": false, 00:26:25.144 "nvme_io": false 00:26:25.144 }, 00:26:25.144 "driver_specific": { 00:26:25.144 "lvol": { 00:26:25.144 "lvol_store_uuid": "93e601b9-0135-4ef7-9a58-ed4bef73fa9e", 00:26:25.144 "base_bdev": "Nvme0n1", 00:26:25.144 "thin_provision": true, 00:26:25.144 "num_allocated_clusters": 0, 00:26:25.144 "snapshot": false, 00:26:25.144 "clone": false, 00:26:25.144 "esnap_clone": false 00:26:25.144 } 00:26:25.144 } 00:26:25.144 } 00:26:25.144 ] 00:26:25.144 10:21:46 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:25.144 10:21:46 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:26:25.144 10:21:46 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:26:25.144 [2024-06-10 10:21:46.920104] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:25.144 COMP_lvs0/lv0 00:26:25.144 10:21:46 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:25.144 10:21:46 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:26:25.144 10:21:46 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:25.144 10:21:46 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:25.144 10:21:46 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:25.144 10:21:46 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:25.144 10:21:46 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:25.404 10:21:47 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:25.663 [ 00:26:25.663 { 00:26:25.663 "name": "COMP_lvs0/lv0", 00:26:25.663 "aliases": [ 00:26:25.663 "0069d667-a13d-5fb6-aab6-737cdc143899" 00:26:25.663 ], 00:26:25.663 "product_name": "compress", 00:26:25.663 "block_size": 512, 00:26:25.663 "num_blocks": 200704, 00:26:25.663 "uuid": "0069d667-a13d-5fb6-aab6-737cdc143899", 00:26:25.663 "assigned_rate_limits": { 00:26:25.663 "rw_ios_per_sec": 0, 00:26:25.663 "rw_mbytes_per_sec": 0, 00:26:25.663 "r_mbytes_per_sec": 0, 00:26:25.663 "w_mbytes_per_sec": 0 00:26:25.663 }, 00:26:25.663 "claimed": false, 00:26:25.663 "zoned": false, 00:26:25.663 "supported_io_types": { 00:26:25.663 "read": true, 00:26:25.663 "write": true, 00:26:25.663 "unmap": false, 00:26:25.663 "write_zeroes": true, 00:26:25.663 "flush": false, 00:26:25.663 "reset": false, 00:26:25.663 "compare": false, 00:26:25.663 "compare_and_write": false, 00:26:25.663 
"abort": false, 00:26:25.663 "nvme_admin": false, 00:26:25.663 "nvme_io": false 00:26:25.663 }, 00:26:25.663 "driver_specific": { 00:26:25.663 "compress": { 00:26:25.663 "name": "COMP_lvs0/lv0", 00:26:25.663 "base_bdev_name": "f39b4f6e-e4a1-4f67-9bd4-4d346cebd2fb" 00:26:25.663 } 00:26:25.663 } 00:26:25.663 } 00:26:25.663 ] 00:26:25.663 10:21:47 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:25.663 10:21:47 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:25.663 [2024-06-10 10:21:47.453733] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2ffc1b15c0 PMD being used: compress_qat 00:26:25.663 [2024-06-10 10:21:47.455335] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x26119a0 PMD being used: compress_qat 00:26:25.663 Running I/O for 3 seconds... 00:26:29.000 00:26:29.000 Latency(us) 00:26:29.000 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:29.000 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:29.000 Verification LBA range: start 0x0 length 0x3100 00:26:29.000 COMP_lvs0/lv0 : 3.01 4302.62 16.81 0.00 0.00 7390.48 123.67 13107.20 00:26:29.000 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:29.000 Verification LBA range: start 0x3100 length 0x3100 00:26:29.000 COMP_lvs0/lv0 : 3.01 4410.12 17.23 0.00 0.00 7218.70 118.15 12754.31 00:26:29.000 =================================================================================================================== 00:26:29.000 Total : 8712.74 34.03 0.00 0.00 7303.52 118.15 13107.20 00:26:29.000 0 00:26:29.000 10:21:50 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:26:29.000 10:21:50 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:29.000 10:21:50 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:29.000 10:21:50 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:29.000 10:21:50 compress_compdev -- compress/compress.sh@78 -- # killprocess 1138855 00:26:29.000 10:21:50 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 1138855 ']' 00:26:29.259 10:21:50 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 1138855 00:26:29.259 10:21:50 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:26:29.259 10:21:50 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:29.259 10:21:50 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1138855 00:26:29.259 10:21:50 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:26:29.259 10:21:50 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:26:29.259 10:21:50 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1138855' 00:26:29.259 killing process with pid 1138855 00:26:29.259 10:21:50 compress_compdev -- common/autotest_common.sh@968 -- # kill 1138855 00:26:29.259 Received shutdown signal, test time was about 3.000000 seconds 00:26:29.259 00:26:29.259 Latency(us) 00:26:29.259 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:29.259 
=================================================================================================================== 00:26:29.259 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:29.259 10:21:50 compress_compdev -- common/autotest_common.sh@973 -- # wait 1138855 00:26:31.799 10:21:53 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:26:31.799 10:21:53 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:31.799 10:21:53 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1140851 00:26:31.799 10:21:53 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:31.799 10:21:53 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1140851 00:26:31.799 10:21:53 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:26:31.799 10:21:53 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 1140851 ']' 00:26:31.800 10:21:53 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:31.800 10:21:53 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:31.800 10:21:53 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:31.800 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:31.800 10:21:53 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:31.800 10:21:53 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:31.800 [2024-06-10 10:21:53.375614] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:26:31.800 [2024-06-10 10:21:53.375667] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1140851 ] 00:26:31.800 [2024-06-10 10:21:53.443179] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:31.800 [2024-06-10 10:21:53.507366] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:26:31.800 [2024-06-10 10:21:53.507371] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:26:32.059 [2024-06-10 10:21:53.907353] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:32.628 10:21:54 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:32.628 10:21:54 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:26:32.628 10:21:54 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:26:32.628 10:21:54 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:32.628 10:21:54 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:35.923 [2024-06-10 10:21:57.243927] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2356570 PMD being used: compress_qat 00:26:35.923 10:21:57 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:35.923 10:21:57 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:26:35.923 10:21:57 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:35.923 10:21:57 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:35.923 10:21:57 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:35.923 10:21:57 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:35.923 10:21:57 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:35.923 10:21:57 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:35.923 [ 00:26:35.923 { 00:26:35.923 "name": "Nvme0n1", 00:26:35.923 "aliases": [ 00:26:35.923 "7c8355b0-1937-4402-900e-47174a03d8ae" 00:26:35.923 ], 00:26:35.923 "product_name": "NVMe disk", 00:26:35.923 "block_size": 512, 00:26:35.923 "num_blocks": 3907029168, 00:26:35.923 "uuid": "7c8355b0-1937-4402-900e-47174a03d8ae", 00:26:35.923 "assigned_rate_limits": { 00:26:35.923 "rw_ios_per_sec": 0, 00:26:35.923 "rw_mbytes_per_sec": 0, 00:26:35.923 "r_mbytes_per_sec": 0, 00:26:35.923 "w_mbytes_per_sec": 0 00:26:35.923 }, 00:26:35.923 "claimed": false, 00:26:35.923 "zoned": false, 00:26:35.923 "supported_io_types": { 00:26:35.923 "read": true, 00:26:35.923 "write": true, 00:26:35.923 "unmap": true, 00:26:35.923 "write_zeroes": true, 00:26:35.923 "flush": true, 00:26:35.923 "reset": true, 00:26:35.923 "compare": false, 00:26:35.923 "compare_and_write": false, 00:26:35.923 "abort": true, 00:26:35.923 "nvme_admin": true, 00:26:35.923 "nvme_io": true 00:26:35.923 }, 00:26:35.923 "driver_specific": { 00:26:35.923 "nvme": [ 00:26:35.923 { 00:26:35.923 "pci_address": "0000:65:00.0", 00:26:35.923 "trid": { 00:26:35.923 "trtype": "PCIe", 00:26:35.923 "traddr": "0000:65:00.0" 00:26:35.923 }, 00:26:35.923 "ctrlr_data": { 
00:26:35.923 "cntlid": 0, 00:26:35.923 "vendor_id": "0x8086", 00:26:35.923 "model_number": "INTEL SSDPE2KX020T8", 00:26:35.923 "serial_number": "PHLJ9512038S2P0BGN", 00:26:35.923 "firmware_revision": "VDV10184", 00:26:35.923 "oacs": { 00:26:35.923 "security": 0, 00:26:35.923 "format": 1, 00:26:35.923 "firmware": 1, 00:26:35.923 "ns_manage": 1 00:26:35.923 }, 00:26:35.923 "multi_ctrlr": false, 00:26:35.923 "ana_reporting": false 00:26:35.923 }, 00:26:35.923 "vs": { 00:26:35.923 "nvme_version": "1.2" 00:26:35.923 }, 00:26:35.923 "ns_data": { 00:26:35.923 "id": 1, 00:26:35.923 "can_share": false 00:26:35.923 } 00:26:35.923 } 00:26:35.923 ], 00:26:35.923 "mp_policy": "active_passive" 00:26:35.923 } 00:26:35.923 } 00:26:35.923 ] 00:26:35.923 10:21:57 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:35.923 10:21:57 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:36.184 [2024-06-10 10:21:57.836189] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23574a0 PMD being used: compress_qat 00:26:37.125 c358dbc1-593b-48ca-9364-afdf3b44ef67 00:26:37.125 10:21:58 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:37.385 55157db2-4123-4cf1-9ced-b662db857d44 00:26:37.385 10:21:59 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:37.385 10:21:59 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:26:37.385 10:21:59 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:37.385 10:21:59 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:37.385 10:21:59 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:37.385 10:21:59 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:37.385 10:21:59 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:37.385 10:21:59 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:37.645 [ 00:26:37.645 { 00:26:37.645 "name": "55157db2-4123-4cf1-9ced-b662db857d44", 00:26:37.645 "aliases": [ 00:26:37.645 "lvs0/lv0" 00:26:37.645 ], 00:26:37.645 "product_name": "Logical Volume", 00:26:37.645 "block_size": 512, 00:26:37.645 "num_blocks": 204800, 00:26:37.645 "uuid": "55157db2-4123-4cf1-9ced-b662db857d44", 00:26:37.645 "assigned_rate_limits": { 00:26:37.645 "rw_ios_per_sec": 0, 00:26:37.645 "rw_mbytes_per_sec": 0, 00:26:37.645 "r_mbytes_per_sec": 0, 00:26:37.645 "w_mbytes_per_sec": 0 00:26:37.645 }, 00:26:37.645 "claimed": false, 00:26:37.645 "zoned": false, 00:26:37.645 "supported_io_types": { 00:26:37.645 "read": true, 00:26:37.645 "write": true, 00:26:37.645 "unmap": true, 00:26:37.645 "write_zeroes": true, 00:26:37.645 "flush": false, 00:26:37.645 "reset": true, 00:26:37.645 "compare": false, 00:26:37.645 "compare_and_write": false, 00:26:37.645 "abort": false, 00:26:37.645 "nvme_admin": false, 00:26:37.645 "nvme_io": false 00:26:37.645 }, 00:26:37.645 "driver_specific": { 00:26:37.645 "lvol": { 00:26:37.645 "lvol_store_uuid": "c358dbc1-593b-48ca-9364-afdf3b44ef67", 00:26:37.645 "base_bdev": "Nvme0n1", 00:26:37.645 "thin_provision": true, 00:26:37.645 "num_allocated_clusters": 0, 
00:26:37.645 "snapshot": false, 00:26:37.645 "clone": false, 00:26:37.645 "esnap_clone": false 00:26:37.645 } 00:26:37.645 } 00:26:37.645 } 00:26:37.645 ] 00:26:37.645 10:21:59 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:37.645 10:21:59 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:26:37.645 10:21:59 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:26:37.905 [2024-06-10 10:21:59.614245] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:37.905 COMP_lvs0/lv0 00:26:37.905 10:21:59 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:37.905 10:21:59 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:26:37.905 10:21:59 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:37.905 10:21:59 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:37.905 10:21:59 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:37.905 10:21:59 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:37.905 10:21:59 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:38.165 10:21:59 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:38.165 [ 00:26:38.165 { 00:26:38.165 "name": "COMP_lvs0/lv0", 00:26:38.165 "aliases": [ 00:26:38.165 "35c7f3e6-cc80-5c22-b8f6-5c4366644cdd" 00:26:38.165 ], 00:26:38.165 "product_name": "compress", 00:26:38.165 "block_size": 4096, 00:26:38.165 "num_blocks": 25088, 00:26:38.165 "uuid": "35c7f3e6-cc80-5c22-b8f6-5c4366644cdd", 00:26:38.165 "assigned_rate_limits": { 00:26:38.165 "rw_ios_per_sec": 0, 00:26:38.165 "rw_mbytes_per_sec": 0, 00:26:38.165 "r_mbytes_per_sec": 0, 00:26:38.165 "w_mbytes_per_sec": 0 00:26:38.165 }, 00:26:38.165 "claimed": false, 00:26:38.165 "zoned": false, 00:26:38.165 "supported_io_types": { 00:26:38.165 "read": true, 00:26:38.165 "write": true, 00:26:38.165 "unmap": false, 00:26:38.165 "write_zeroes": true, 00:26:38.165 "flush": false, 00:26:38.165 "reset": false, 00:26:38.165 "compare": false, 00:26:38.165 "compare_and_write": false, 00:26:38.165 "abort": false, 00:26:38.165 "nvme_admin": false, 00:26:38.165 "nvme_io": false 00:26:38.165 }, 00:26:38.165 "driver_specific": { 00:26:38.165 "compress": { 00:26:38.165 "name": "COMP_lvs0/lv0", 00:26:38.165 "base_bdev_name": "55157db2-4123-4cf1-9ced-b662db857d44" 00:26:38.165 } 00:26:38.165 } 00:26:38.165 } 00:26:38.165 ] 00:26:38.165 10:21:59 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:38.165 10:21:59 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:38.424 [2024-06-10 10:22:00.084297] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2236ab0 PMD being used: compress_qat 00:26:38.424 [2024-06-10 10:22:00.085590] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f451c19bc10 PMD being used: compress_qat 00:26:38.424 Running I/O for 3 seconds... 
00:26:41.720 00:26:41.720 Latency(us) 00:26:41.720 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:41.720 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:41.720 Verification LBA range: start 0x0 length 0x3100 00:26:41.720 COMP_lvs0/lv0 : 3.01 3813.07 14.89 0.00 0.00 8345.88 172.50 15728.64 00:26:41.720 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:41.720 Verification LBA range: start 0x3100 length 0x3100 00:26:41.720 COMP_lvs0/lv0 : 3.01 3713.59 14.51 0.00 0.00 8580.24 179.59 16131.94 00:26:41.720 =================================================================================================================== 00:26:41.720 Total : 7526.67 29.40 0.00 0.00 8461.45 172.50 16131.94 00:26:41.720 0 00:26:41.720 10:22:03 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:26:41.720 10:22:03 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:41.720 10:22:03 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:41.720 10:22:03 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:41.720 10:22:03 compress_compdev -- compress/compress.sh@78 -- # killprocess 1140851 00:26:41.720 10:22:03 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 1140851 ']' 00:26:41.720 10:22:03 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 1140851 00:26:41.720 10:22:03 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:26:41.720 10:22:03 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:41.720 10:22:03 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1140851 00:26:41.720 10:22:03 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:26:41.720 10:22:03 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:26:41.720 10:22:03 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1140851' 00:26:41.720 killing process with pid 1140851 00:26:41.720 10:22:03 compress_compdev -- common/autotest_common.sh@968 -- # kill 1140851 00:26:41.720 Received shutdown signal, test time was about 3.000000 seconds 00:26:41.720 00:26:41.720 Latency(us) 00:26:41.720 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:41.720 =================================================================================================================== 00:26:41.720 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:41.720 10:22:03 compress_compdev -- common/autotest_common.sh@973 -- # wait 1140851 00:26:44.259 10:22:05 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:26:44.259 10:22:05 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:44.259 10:22:05 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1142974 00:26:44.259 10:22:05 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:44.259 10:22:05 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 1142974 00:26:44.259 10:22:05 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:26:44.259 10:22:05 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 1142974 ']' 00:26:44.259 10:22:05 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:44.259 10:22:05 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:44.259 10:22:05 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:44.259 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:44.259 10:22:05 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:44.259 10:22:05 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:44.259 [2024-06-10 10:22:05.888156] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:26:44.259 [2024-06-10 10:22:05.888208] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1142974 ] 00:26:44.259 [2024-06-10 10:22:05.975271] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:44.259 [2024-06-10 10:22:06.066805] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:26:44.259 [2024-06-10 10:22:06.066939] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:26:44.259 [2024-06-10 10:22:06.067107] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:44.828 [2024-06-10 10:22:06.477958] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:45.088 10:22:06 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:45.088 10:22:06 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:26:45.088 10:22:06 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:26:45.088 10:22:06 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:45.088 10:22:06 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:48.381 [2024-06-10 10:22:09.764641] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15e0fb0 PMD being used: compress_qat 00:26:48.381 10:22:09 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:48.381 10:22:09 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:26:48.381 10:22:09 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:48.381 10:22:09 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:48.381 10:22:09 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:48.381 10:22:09 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:48.381 10:22:09 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:48.381 10:22:09 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:48.381 [ 00:26:48.381 { 00:26:48.381 "name": "Nvme0n1", 00:26:48.381 "aliases": [ 00:26:48.381 "e71e167f-7186-4382-a356-22579054c262" 00:26:48.381 ], 00:26:48.381 "product_name": "NVMe 
disk", 00:26:48.381 "block_size": 512, 00:26:48.381 "num_blocks": 3907029168, 00:26:48.381 "uuid": "e71e167f-7186-4382-a356-22579054c262", 00:26:48.381 "assigned_rate_limits": { 00:26:48.381 "rw_ios_per_sec": 0, 00:26:48.381 "rw_mbytes_per_sec": 0, 00:26:48.381 "r_mbytes_per_sec": 0, 00:26:48.381 "w_mbytes_per_sec": 0 00:26:48.381 }, 00:26:48.381 "claimed": false, 00:26:48.381 "zoned": false, 00:26:48.381 "supported_io_types": { 00:26:48.381 "read": true, 00:26:48.381 "write": true, 00:26:48.381 "unmap": true, 00:26:48.381 "write_zeroes": true, 00:26:48.381 "flush": true, 00:26:48.381 "reset": true, 00:26:48.381 "compare": false, 00:26:48.381 "compare_and_write": false, 00:26:48.381 "abort": true, 00:26:48.381 "nvme_admin": true, 00:26:48.381 "nvme_io": true 00:26:48.381 }, 00:26:48.381 "driver_specific": { 00:26:48.381 "nvme": [ 00:26:48.381 { 00:26:48.381 "pci_address": "0000:65:00.0", 00:26:48.381 "trid": { 00:26:48.381 "trtype": "PCIe", 00:26:48.381 "traddr": "0000:65:00.0" 00:26:48.381 }, 00:26:48.381 "ctrlr_data": { 00:26:48.381 "cntlid": 0, 00:26:48.381 "vendor_id": "0x8086", 00:26:48.381 "model_number": "INTEL SSDPE2KX020T8", 00:26:48.381 "serial_number": "PHLJ9512038S2P0BGN", 00:26:48.381 "firmware_revision": "VDV10184", 00:26:48.381 "oacs": { 00:26:48.381 "security": 0, 00:26:48.381 "format": 1, 00:26:48.381 "firmware": 1, 00:26:48.381 "ns_manage": 1 00:26:48.381 }, 00:26:48.381 "multi_ctrlr": false, 00:26:48.381 "ana_reporting": false 00:26:48.381 }, 00:26:48.381 "vs": { 00:26:48.381 "nvme_version": "1.2" 00:26:48.381 }, 00:26:48.381 "ns_data": { 00:26:48.381 "id": 1, 00:26:48.381 "can_share": false 00:26:48.381 } 00:26:48.381 } 00:26:48.381 ], 00:26:48.381 "mp_policy": "active_passive" 00:26:48.381 } 00:26:48.381 } 00:26:48.381 ] 00:26:48.381 10:22:10 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:48.381 10:22:10 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:48.642 [2024-06-10 10:22:10.382324] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15e1ee0 PMD being used: compress_qat 00:26:50.025 4eabf2ce-c18c-4eb7-bfbe-a42dd18e2c30 00:26:50.025 10:22:11 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:50.025 e2f57acc-34b7-490e-a32b-44a0fc122faf 00:26:50.025 10:22:11 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:50.025 10:22:11 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:26:50.025 10:22:11 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:50.025 10:22:11 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:50.025 10:22:11 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:50.025 10:22:11 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:50.025 10:22:11 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:50.025 10:22:11 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:50.286 [ 00:26:50.286 { 00:26:50.286 "name": "e2f57acc-34b7-490e-a32b-44a0fc122faf", 00:26:50.286 "aliases": [ 00:26:50.286 "lvs0/lv0" 00:26:50.286 ], 00:26:50.286 "product_name": "Logical 
Volume", 00:26:50.286 "block_size": 512, 00:26:50.286 "num_blocks": 204800, 00:26:50.286 "uuid": "e2f57acc-34b7-490e-a32b-44a0fc122faf", 00:26:50.286 "assigned_rate_limits": { 00:26:50.286 "rw_ios_per_sec": 0, 00:26:50.286 "rw_mbytes_per_sec": 0, 00:26:50.286 "r_mbytes_per_sec": 0, 00:26:50.286 "w_mbytes_per_sec": 0 00:26:50.286 }, 00:26:50.286 "claimed": false, 00:26:50.286 "zoned": false, 00:26:50.286 "supported_io_types": { 00:26:50.286 "read": true, 00:26:50.286 "write": true, 00:26:50.286 "unmap": true, 00:26:50.286 "write_zeroes": true, 00:26:50.286 "flush": false, 00:26:50.286 "reset": true, 00:26:50.286 "compare": false, 00:26:50.286 "compare_and_write": false, 00:26:50.286 "abort": false, 00:26:50.286 "nvme_admin": false, 00:26:50.286 "nvme_io": false 00:26:50.286 }, 00:26:50.286 "driver_specific": { 00:26:50.286 "lvol": { 00:26:50.286 "lvol_store_uuid": "4eabf2ce-c18c-4eb7-bfbe-a42dd18e2c30", 00:26:50.286 "base_bdev": "Nvme0n1", 00:26:50.286 "thin_provision": true, 00:26:50.286 "num_allocated_clusters": 0, 00:26:50.286 "snapshot": false, 00:26:50.286 "clone": false, 00:26:50.286 "esnap_clone": false 00:26:50.286 } 00:26:50.286 } 00:26:50.286 } 00:26:50.286 ] 00:26:50.286 10:22:12 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:50.287 10:22:12 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:26:50.287 10:22:12 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:26:50.547 [2024-06-10 10:22:12.276494] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:50.547 COMP_lvs0/lv0 00:26:50.547 10:22:12 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:50.547 10:22:12 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:26:50.547 10:22:12 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:50.547 10:22:12 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:50.547 10:22:12 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:50.547 10:22:12 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:50.547 10:22:12 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:50.807 10:22:12 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:51.068 [ 00:26:51.068 { 00:26:51.068 "name": "COMP_lvs0/lv0", 00:26:51.068 "aliases": [ 00:26:51.068 "838910f8-ec93-5c24-a771-a948907599eb" 00:26:51.068 ], 00:26:51.068 "product_name": "compress", 00:26:51.068 "block_size": 512, 00:26:51.068 "num_blocks": 200704, 00:26:51.068 "uuid": "838910f8-ec93-5c24-a771-a948907599eb", 00:26:51.068 "assigned_rate_limits": { 00:26:51.068 "rw_ios_per_sec": 0, 00:26:51.068 "rw_mbytes_per_sec": 0, 00:26:51.068 "r_mbytes_per_sec": 0, 00:26:51.068 "w_mbytes_per_sec": 0 00:26:51.068 }, 00:26:51.068 "claimed": false, 00:26:51.068 "zoned": false, 00:26:51.068 "supported_io_types": { 00:26:51.068 "read": true, 00:26:51.068 "write": true, 00:26:51.068 "unmap": false, 00:26:51.068 "write_zeroes": true, 00:26:51.068 "flush": false, 00:26:51.068 "reset": false, 00:26:51.068 "compare": false, 00:26:51.068 "compare_and_write": false, 00:26:51.068 "abort": false, 00:26:51.068 
"nvme_admin": false, 00:26:51.068 "nvme_io": false 00:26:51.068 }, 00:26:51.068 "driver_specific": { 00:26:51.068 "compress": { 00:26:51.068 "name": "COMP_lvs0/lv0", 00:26:51.068 "base_bdev_name": "e2f57acc-34b7-490e-a32b-44a0fc122faf" 00:26:51.068 } 00:26:51.068 } 00:26:51.068 } 00:26:51.068 ] 00:26:51.068 10:22:12 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:51.068 10:22:12 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:26:51.068 [2024-06-10 10:22:12.813788] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fbc481b1350 PMD being used: compress_qat 00:26:51.068 I/O targets: 00:26:51.068 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:26:51.068 00:26:51.068 00:26:51.068 CUnit - A unit testing framework for C - Version 2.1-3 00:26:51.068 http://cunit.sourceforge.net/ 00:26:51.068 00:26:51.068 00:26:51.068 Suite: bdevio tests on: COMP_lvs0/lv0 00:26:51.068 Test: blockdev write read block ...passed 00:26:51.068 Test: blockdev write zeroes read block ...passed 00:26:51.068 Test: blockdev write zeroes read no split ...passed 00:26:51.068 Test: blockdev write zeroes read split ...passed 00:26:51.068 Test: blockdev write zeroes read split partial ...passed 00:26:51.068 Test: blockdev reset ...[2024-06-10 10:22:12.868238] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:26:51.068 passed 00:26:51.068 Test: blockdev write read 8 blocks ...passed 00:26:51.068 Test: blockdev write read size > 128k ...passed 00:26:51.068 Test: blockdev write read invalid size ...passed 00:26:51.068 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:51.068 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:51.068 Test: blockdev write read max offset ...passed 00:26:51.068 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:51.068 Test: blockdev writev readv 8 blocks ...passed 00:26:51.068 Test: blockdev writev readv 30 x 1block ...passed 00:26:51.068 Test: blockdev writev readv block ...passed 00:26:51.068 Test: blockdev writev readv size > 128k ...passed 00:26:51.068 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:51.068 Test: blockdev comparev and writev ...passed 00:26:51.068 Test: blockdev nvme passthru rw ...passed 00:26:51.068 Test: blockdev nvme passthru vendor specific ...passed 00:26:51.068 Test: blockdev nvme admin passthru ...passed 00:26:51.068 Test: blockdev copy ...passed 00:26:51.068 00:26:51.068 Run Summary: Type Total Ran Passed Failed Inactive 00:26:51.068 suites 1 1 n/a 0 0 00:26:51.068 tests 23 23 23 0 0 00:26:51.068 asserts 130 130 130 0 n/a 00:26:51.068 00:26:51.068 Elapsed time = 0.183 seconds 00:26:51.068 0 00:26:51.068 10:22:12 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:26:51.068 10:22:12 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:51.329 10:22:13 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:51.589 10:22:13 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:26:51.589 10:22:13 compress_compdev -- compress/compress.sh@62 -- # killprocess 1142974 00:26:51.589 10:22:13 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 1142974 ']' 00:26:51.589 10:22:13 
compress_compdev -- common/autotest_common.sh@953 -- # kill -0 1142974 00:26:51.590 10:22:13 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:26:51.590 10:22:13 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:51.590 10:22:13 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1142974 00:26:51.590 10:22:13 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:51.590 10:22:13 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:51.590 10:22:13 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1142974' 00:26:51.590 killing process with pid 1142974 00:26:51.590 10:22:13 compress_compdev -- common/autotest_common.sh@968 -- # kill 1142974 00:26:51.590 10:22:13 compress_compdev -- common/autotest_common.sh@973 -- # wait 1142974 00:26:54.195 10:22:15 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:26:54.195 10:22:15 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:26:54.195 00:26:54.195 real 0m48.118s 00:26:54.195 user 1m49.462s 00:26:54.195 sys 0m3.612s 00:26:54.195 10:22:15 compress_compdev -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:54.195 10:22:15 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:54.195 ************************************ 00:26:54.195 END TEST compress_compdev 00:26:54.195 ************************************ 00:26:54.195 10:22:15 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:26:54.195 10:22:15 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:26:54.195 10:22:15 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:54.195 10:22:15 -- common/autotest_common.sh@10 -- # set +x 00:26:54.195 ************************************ 00:26:54.195 START TEST compress_isal 00:26:54.195 ************************************ 00:26:54.195 10:22:15 compress_isal -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:26:54.195 * Looking for test storage... 
00:26:54.195 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:26:54.195 10:22:15 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:54.195 10:22:15 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:54.195 10:22:15 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:54.195 10:22:15 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:54.195 10:22:15 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:54.195 10:22:15 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:54.195 10:22:15 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:54.195 10:22:15 compress_isal -- paths/export.sh@5 -- # export PATH 00:26:54.195 10:22:15 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@47 -- # : 0 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:26:54.195 10:22:15 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:26:54.195 10:22:15 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:54.195 10:22:15 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:26:54.195 10:22:16 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:26:54.195 10:22:16 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:26:54.195 10:22:16 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:26:54.195 10:22:16 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1144739 00:26:54.195 10:22:16 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:54.195 10:22:16 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1144739 00:26:54.195 10:22:16 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 1144739 ']' 00:26:54.195 10:22:16 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:26:54.195 10:22:16 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:54.195 10:22:16 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:54.195 10:22:16 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:54.195 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:26:54.195 10:22:16 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:54.195 10:22:16 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:26:54.457 [2024-06-10 10:22:16.061076] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:26:54.457 [2024-06-10 10:22:16.061146] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1144739 ] 00:26:54.457 [2024-06-10 10:22:16.136193] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:54.457 [2024-06-10 10:22:16.207973] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:26:54.457 [2024-06-10 10:22:16.207979] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:26:55.399 10:22:16 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:55.399 10:22:16 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:26:55.399 10:22:16 compress_isal -- compress/compress.sh@74 -- # create_vols 00:26:55.399 10:22:16 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:55.399 10:22:16 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:58.700 10:22:19 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:58.700 10:22:19 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:26:58.700 10:22:19 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:58.700 10:22:19 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:58.700 10:22:19 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:58.700 10:22:19 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:58.700 10:22:19 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:58.700 10:22:20 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:58.700 [ 00:26:58.700 { 00:26:58.700 "name": "Nvme0n1", 00:26:58.700 "aliases": [ 00:26:58.700 "dd974d3a-8d31-4395-9aaf-b15b56cba4c7" 00:26:58.700 ], 00:26:58.700 "product_name": "NVMe disk", 00:26:58.700 "block_size": 512, 00:26:58.700 "num_blocks": 3907029168, 00:26:58.700 "uuid": "dd974d3a-8d31-4395-9aaf-b15b56cba4c7", 00:26:58.700 "assigned_rate_limits": { 00:26:58.700 "rw_ios_per_sec": 0, 00:26:58.700 "rw_mbytes_per_sec": 0, 00:26:58.700 "r_mbytes_per_sec": 0, 00:26:58.700 "w_mbytes_per_sec": 0 00:26:58.700 }, 00:26:58.700 "claimed": false, 00:26:58.700 "zoned": false, 00:26:58.700 "supported_io_types": { 00:26:58.700 "read": true, 00:26:58.700 "write": true, 00:26:58.700 "unmap": true, 00:26:58.700 "write_zeroes": true, 00:26:58.700 "flush": true, 00:26:58.700 "reset": true, 00:26:58.700 "compare": false, 00:26:58.700 "compare_and_write": false, 00:26:58.700 "abort": true, 00:26:58.700 "nvme_admin": true, 00:26:58.700 "nvme_io": true 00:26:58.700 }, 00:26:58.700 "driver_specific": { 00:26:58.700 "nvme": [ 00:26:58.700 { 00:26:58.700 "pci_address": "0000:65:00.0", 00:26:58.700 "trid": { 00:26:58.700 "trtype": "PCIe", 00:26:58.700 "traddr": "0000:65:00.0" 00:26:58.700 }, 00:26:58.700 "ctrlr_data": { 00:26:58.700 "cntlid": 0, 
00:26:58.700 "vendor_id": "0x8086", 00:26:58.700 "model_number": "INTEL SSDPE2KX020T8", 00:26:58.700 "serial_number": "PHLJ9512038S2P0BGN", 00:26:58.700 "firmware_revision": "VDV10184", 00:26:58.700 "oacs": { 00:26:58.700 "security": 0, 00:26:58.700 "format": 1, 00:26:58.700 "firmware": 1, 00:26:58.700 "ns_manage": 1 00:26:58.700 }, 00:26:58.700 "multi_ctrlr": false, 00:26:58.700 "ana_reporting": false 00:26:58.700 }, 00:26:58.700 "vs": { 00:26:58.700 "nvme_version": "1.2" 00:26:58.700 }, 00:26:58.700 "ns_data": { 00:26:58.700 "id": 1, 00:26:58.700 "can_share": false 00:26:58.700 } 00:26:58.700 } 00:26:58.700 ], 00:26:58.700 "mp_policy": "active_passive" 00:26:58.700 } 00:26:58.700 } 00:26:58.700 ] 00:26:58.700 10:22:20 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:58.700 10:22:20 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:00.083 60eff270-5e49-4518-8e07-8173c7d0ad5e 00:27:00.083 10:22:21 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:00.083 51603682-5af0-4114-a786-56c1ad7149bc 00:27:00.083 10:22:21 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:00.084 10:22:21 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:27:00.084 10:22:21 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:00.084 10:22:21 compress_isal -- common/autotest_common.sh@900 -- # local i 00:27:00.084 10:22:21 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:00.084 10:22:21 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:00.084 10:22:21 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:00.084 10:22:21 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:00.343 [ 00:27:00.344 { 00:27:00.344 "name": "51603682-5af0-4114-a786-56c1ad7149bc", 00:27:00.344 "aliases": [ 00:27:00.344 "lvs0/lv0" 00:27:00.344 ], 00:27:00.344 "product_name": "Logical Volume", 00:27:00.344 "block_size": 512, 00:27:00.344 "num_blocks": 204800, 00:27:00.344 "uuid": "51603682-5af0-4114-a786-56c1ad7149bc", 00:27:00.344 "assigned_rate_limits": { 00:27:00.344 "rw_ios_per_sec": 0, 00:27:00.344 "rw_mbytes_per_sec": 0, 00:27:00.344 "r_mbytes_per_sec": 0, 00:27:00.344 "w_mbytes_per_sec": 0 00:27:00.344 }, 00:27:00.344 "claimed": false, 00:27:00.344 "zoned": false, 00:27:00.344 "supported_io_types": { 00:27:00.344 "read": true, 00:27:00.344 "write": true, 00:27:00.344 "unmap": true, 00:27:00.344 "write_zeroes": true, 00:27:00.344 "flush": false, 00:27:00.344 "reset": true, 00:27:00.344 "compare": false, 00:27:00.344 "compare_and_write": false, 00:27:00.344 "abort": false, 00:27:00.344 "nvme_admin": false, 00:27:00.344 "nvme_io": false 00:27:00.344 }, 00:27:00.344 "driver_specific": { 00:27:00.344 "lvol": { 00:27:00.344 "lvol_store_uuid": "60eff270-5e49-4518-8e07-8173c7d0ad5e", 00:27:00.344 "base_bdev": "Nvme0n1", 00:27:00.344 "thin_provision": true, 00:27:00.344 "num_allocated_clusters": 0, 00:27:00.344 "snapshot": false, 00:27:00.344 "clone": false, 00:27:00.344 "esnap_clone": false 00:27:00.344 } 00:27:00.344 } 00:27:00.344 } 00:27:00.344 ] 00:27:00.344 10:22:22 compress_isal -- 
common/autotest_common.sh@906 -- # return 0 00:27:00.344 10:22:22 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:27:00.344 10:22:22 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:27:00.602 [2024-06-10 10:22:22.285867] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:00.602 COMP_lvs0/lv0 00:27:00.602 10:22:22 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:00.602 10:22:22 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:27:00.602 10:22:22 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:00.602 10:22:22 compress_isal -- common/autotest_common.sh@900 -- # local i 00:27:00.602 10:22:22 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:00.602 10:22:22 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:00.602 10:22:22 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:00.862 10:22:22 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:00.862 [ 00:27:00.862 { 00:27:00.862 "name": "COMP_lvs0/lv0", 00:27:00.862 "aliases": [ 00:27:00.862 "c437409d-fabf-5e6b-826e-8a6b1f6bf663" 00:27:00.862 ], 00:27:00.862 "product_name": "compress", 00:27:00.862 "block_size": 512, 00:27:00.862 "num_blocks": 200704, 00:27:00.862 "uuid": "c437409d-fabf-5e6b-826e-8a6b1f6bf663", 00:27:00.862 "assigned_rate_limits": { 00:27:00.862 "rw_ios_per_sec": 0, 00:27:00.862 "rw_mbytes_per_sec": 0, 00:27:00.862 "r_mbytes_per_sec": 0, 00:27:00.862 "w_mbytes_per_sec": 0 00:27:00.862 }, 00:27:00.862 "claimed": false, 00:27:00.862 "zoned": false, 00:27:00.862 "supported_io_types": { 00:27:00.862 "read": true, 00:27:00.862 "write": true, 00:27:00.862 "unmap": false, 00:27:00.862 "write_zeroes": true, 00:27:00.862 "flush": false, 00:27:00.862 "reset": false, 00:27:00.862 "compare": false, 00:27:00.862 "compare_and_write": false, 00:27:00.862 "abort": false, 00:27:00.862 "nvme_admin": false, 00:27:00.862 "nvme_io": false 00:27:00.862 }, 00:27:00.862 "driver_specific": { 00:27:00.862 "compress": { 00:27:00.862 "name": "COMP_lvs0/lv0", 00:27:00.862 "base_bdev_name": "51603682-5af0-4114-a786-56c1ad7149bc" 00:27:00.862 } 00:27:00.862 } 00:27:00.862 } 00:27:00.862 ] 00:27:00.862 10:22:22 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:27:00.862 10:22:22 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:01.122 Running I/O for 3 seconds... 
00:27:04.422 00:27:04.422 Latency(us) 00:27:04.422 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:04.422 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:04.422 Verification LBA range: start 0x0 length 0x3100 00:27:04.422 COMP_lvs0/lv0 : 3.01 3067.94 11.98 0.00 0.00 10386.36 64.59 17341.83 00:27:04.422 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:04.422 Verification LBA range: start 0x3100 length 0x3100 00:27:04.422 COMP_lvs0/lv0 : 3.01 3054.77 11.93 0.00 0.00 10434.84 57.50 17543.48 00:27:04.422 =================================================================================================================== 00:27:04.422 Total : 6122.71 23.92 0.00 0.00 10410.55 57.50 17543.48 00:27:04.422 0 00:27:04.422 10:22:25 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:27:04.422 10:22:25 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:04.422 10:22:26 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:04.422 10:22:26 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:04.422 10:22:26 compress_isal -- compress/compress.sh@78 -- # killprocess 1144739 00:27:04.422 10:22:26 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 1144739 ']' 00:27:04.422 10:22:26 compress_isal -- common/autotest_common.sh@953 -- # kill -0 1144739 00:27:04.422 10:22:26 compress_isal -- common/autotest_common.sh@954 -- # uname 00:27:04.422 10:22:26 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:04.422 10:22:26 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1144739 00:27:04.422 10:22:26 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:27:04.422 10:22:26 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:27:04.422 10:22:26 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1144739' 00:27:04.422 killing process with pid 1144739 00:27:04.422 10:22:26 compress_isal -- common/autotest_common.sh@968 -- # kill 1144739 00:27:04.422 Received shutdown signal, test time was about 3.000000 seconds 00:27:04.422 00:27:04.422 Latency(us) 00:27:04.422 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:04.422 =================================================================================================================== 00:27:04.422 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:04.422 10:22:26 compress_isal -- common/autotest_common.sh@973 -- # wait 1144739 00:27:06.967 10:22:28 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:27:06.967 10:22:28 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:06.967 10:22:28 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1146751 00:27:06.967 10:22:28 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:06.967 10:22:28 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1146751 00:27:06.967 10:22:28 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:27:06.967 10:22:28 compress_isal -- common/autotest_common.sh@830 -- # 
'[' -z 1146751 ']' 00:27:06.968 10:22:28 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:06.968 10:22:28 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:06.968 10:22:28 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:06.968 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:06.968 10:22:28 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:06.968 10:22:28 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:06.968 [2024-06-10 10:22:28.590223] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:27:06.968 [2024-06-10 10:22:28.590274] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1146751 ] 00:27:06.968 [2024-06-10 10:22:28.657946] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:06.968 [2024-06-10 10:22:28.721684] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:27:06.968 [2024-06-10 10:22:28.721690] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:27:07.908 10:22:29 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:07.908 10:22:29 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:27:07.908 10:22:29 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:27:07.908 10:22:29 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:07.908 10:22:29 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:11.211 10:22:32 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:11.211 10:22:32 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:27:11.211 10:22:32 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:11.211 10:22:32 compress_isal -- common/autotest_common.sh@900 -- # local i 00:27:11.211 10:22:32 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:11.211 10:22:32 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:11.211 10:22:32 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:11.211 10:22:32 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:11.211 [ 00:27:11.211 { 00:27:11.211 "name": "Nvme0n1", 00:27:11.211 "aliases": [ 00:27:11.211 "a92efccc-d597-429c-9b86-afa85f5cf15a" 00:27:11.211 ], 00:27:11.211 "product_name": "NVMe disk", 00:27:11.211 "block_size": 512, 00:27:11.211 "num_blocks": 3907029168, 00:27:11.211 "uuid": "a92efccc-d597-429c-9b86-afa85f5cf15a", 00:27:11.211 "assigned_rate_limits": { 00:27:11.211 "rw_ios_per_sec": 0, 00:27:11.211 "rw_mbytes_per_sec": 0, 00:27:11.211 "r_mbytes_per_sec": 0, 00:27:11.211 "w_mbytes_per_sec": 0 00:27:11.211 }, 00:27:11.211 "claimed": false, 00:27:11.211 "zoned": false, 00:27:11.211 "supported_io_types": { 00:27:11.211 "read": true, 00:27:11.211 "write": true, 00:27:11.211 "unmap": true, 00:27:11.211 "write_zeroes": true, 00:27:11.211 "flush": true, 
00:27:11.211 "reset": true, 00:27:11.211 "compare": false, 00:27:11.211 "compare_and_write": false, 00:27:11.211 "abort": true, 00:27:11.211 "nvme_admin": true, 00:27:11.211 "nvme_io": true 00:27:11.211 }, 00:27:11.211 "driver_specific": { 00:27:11.211 "nvme": [ 00:27:11.211 { 00:27:11.211 "pci_address": "0000:65:00.0", 00:27:11.211 "trid": { 00:27:11.211 "trtype": "PCIe", 00:27:11.211 "traddr": "0000:65:00.0" 00:27:11.211 }, 00:27:11.211 "ctrlr_data": { 00:27:11.211 "cntlid": 0, 00:27:11.211 "vendor_id": "0x8086", 00:27:11.211 "model_number": "INTEL SSDPE2KX020T8", 00:27:11.211 "serial_number": "PHLJ9512038S2P0BGN", 00:27:11.211 "firmware_revision": "VDV10184", 00:27:11.211 "oacs": { 00:27:11.211 "security": 0, 00:27:11.211 "format": 1, 00:27:11.211 "firmware": 1, 00:27:11.211 "ns_manage": 1 00:27:11.211 }, 00:27:11.211 "multi_ctrlr": false, 00:27:11.211 "ana_reporting": false 00:27:11.211 }, 00:27:11.211 "vs": { 00:27:11.212 "nvme_version": "1.2" 00:27:11.212 }, 00:27:11.212 "ns_data": { 00:27:11.212 "id": 1, 00:27:11.212 "can_share": false 00:27:11.212 } 00:27:11.212 } 00:27:11.212 ], 00:27:11.212 "mp_policy": "active_passive" 00:27:11.212 } 00:27:11.212 } 00:27:11.212 ] 00:27:11.212 10:22:32 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:27:11.212 10:22:32 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:12.596 96e15bb4-6d65-49ff-99b8-eabd9bb428c9 00:27:12.596 10:22:34 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:12.596 6ff82c08-be94-41cb-8a6f-f0326d537fa7 00:27:12.596 10:22:34 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:12.596 10:22:34 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:27:12.596 10:22:34 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:12.596 10:22:34 compress_isal -- common/autotest_common.sh@900 -- # local i 00:27:12.596 10:22:34 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:12.596 10:22:34 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:12.596 10:22:34 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:12.856 10:22:34 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:12.856 [ 00:27:12.856 { 00:27:12.856 "name": "6ff82c08-be94-41cb-8a6f-f0326d537fa7", 00:27:12.856 "aliases": [ 00:27:12.856 "lvs0/lv0" 00:27:12.856 ], 00:27:12.856 "product_name": "Logical Volume", 00:27:12.856 "block_size": 512, 00:27:12.856 "num_blocks": 204800, 00:27:12.856 "uuid": "6ff82c08-be94-41cb-8a6f-f0326d537fa7", 00:27:12.856 "assigned_rate_limits": { 00:27:12.856 "rw_ios_per_sec": 0, 00:27:12.856 "rw_mbytes_per_sec": 0, 00:27:12.856 "r_mbytes_per_sec": 0, 00:27:12.856 "w_mbytes_per_sec": 0 00:27:12.856 }, 00:27:12.856 "claimed": false, 00:27:12.856 "zoned": false, 00:27:12.856 "supported_io_types": { 00:27:12.856 "read": true, 00:27:12.856 "write": true, 00:27:12.856 "unmap": true, 00:27:12.856 "write_zeroes": true, 00:27:12.856 "flush": false, 00:27:12.856 "reset": true, 00:27:12.856 "compare": false, 00:27:12.856 "compare_and_write": false, 00:27:12.856 "abort": false, 00:27:12.856 "nvme_admin": false, 00:27:12.856 
"nvme_io": false 00:27:12.856 }, 00:27:12.856 "driver_specific": { 00:27:12.856 "lvol": { 00:27:12.856 "lvol_store_uuid": "96e15bb4-6d65-49ff-99b8-eabd9bb428c9", 00:27:12.856 "base_bdev": "Nvme0n1", 00:27:12.856 "thin_provision": true, 00:27:12.856 "num_allocated_clusters": 0, 00:27:12.856 "snapshot": false, 00:27:12.856 "clone": false, 00:27:12.856 "esnap_clone": false 00:27:12.856 } 00:27:12.856 } 00:27:12.856 } 00:27:12.856 ] 00:27:12.856 10:22:34 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:27:12.856 10:22:34 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:27:12.856 10:22:34 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:27:13.123 [2024-06-10 10:22:34.855668] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:13.123 COMP_lvs0/lv0 00:27:13.123 10:22:34 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:13.123 10:22:34 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:27:13.123 10:22:34 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:13.123 10:22:34 compress_isal -- common/autotest_common.sh@900 -- # local i 00:27:13.123 10:22:34 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:13.123 10:22:34 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:13.123 10:22:34 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:13.386 10:22:35 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:13.386 [ 00:27:13.386 { 00:27:13.386 "name": "COMP_lvs0/lv0", 00:27:13.386 "aliases": [ 00:27:13.386 "34397b76-5be1-5e17-adc8-918b0179ad08" 00:27:13.386 ], 00:27:13.386 "product_name": "compress", 00:27:13.386 "block_size": 512, 00:27:13.386 "num_blocks": 200704, 00:27:13.386 "uuid": "34397b76-5be1-5e17-adc8-918b0179ad08", 00:27:13.386 "assigned_rate_limits": { 00:27:13.386 "rw_ios_per_sec": 0, 00:27:13.386 "rw_mbytes_per_sec": 0, 00:27:13.386 "r_mbytes_per_sec": 0, 00:27:13.386 "w_mbytes_per_sec": 0 00:27:13.386 }, 00:27:13.386 "claimed": false, 00:27:13.386 "zoned": false, 00:27:13.386 "supported_io_types": { 00:27:13.386 "read": true, 00:27:13.386 "write": true, 00:27:13.386 "unmap": false, 00:27:13.386 "write_zeroes": true, 00:27:13.386 "flush": false, 00:27:13.386 "reset": false, 00:27:13.386 "compare": false, 00:27:13.386 "compare_and_write": false, 00:27:13.386 "abort": false, 00:27:13.386 "nvme_admin": false, 00:27:13.386 "nvme_io": false 00:27:13.386 }, 00:27:13.386 "driver_specific": { 00:27:13.386 "compress": { 00:27:13.386 "name": "COMP_lvs0/lv0", 00:27:13.386 "base_bdev_name": "6ff82c08-be94-41cb-8a6f-f0326d537fa7" 00:27:13.386 } 00:27:13.386 } 00:27:13.386 } 00:27:13.386 ] 00:27:13.386 10:22:35 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:27:13.386 10:22:35 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:13.646 Running I/O for 3 seconds... 
00:27:16.945 00:27:16.945 Latency(us) 00:27:16.945 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:16.945 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:16.945 Verification LBA range: start 0x0 length 0x3100 00:27:16.945 COMP_lvs0/lv0 : 3.00 3629.26 14.18 0.00 0.00 8773.76 47.85 14922.04 00:27:16.945 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:16.945 Verification LBA range: start 0x3100 length 0x3100 00:27:16.945 COMP_lvs0/lv0 : 3.00 3618.88 14.14 0.00 0.00 8805.81 49.62 14821.22 00:27:16.945 =================================================================================================================== 00:27:16.945 Total : 7248.14 28.31 0.00 0.00 8789.76 47.85 14922.04 00:27:16.945 0 00:27:16.945 10:22:38 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:27:16.945 10:22:38 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:16.945 10:22:38 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:16.945 10:22:38 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:16.945 10:22:38 compress_isal -- compress/compress.sh@78 -- # killprocess 1146751 00:27:16.945 10:22:38 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 1146751 ']' 00:27:16.945 10:22:38 compress_isal -- common/autotest_common.sh@953 -- # kill -0 1146751 00:27:16.945 10:22:38 compress_isal -- common/autotest_common.sh@954 -- # uname 00:27:16.945 10:22:38 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:16.945 10:22:38 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1146751 00:27:16.945 10:22:38 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:27:16.945 10:22:38 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:27:16.945 10:22:38 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1146751' 00:27:16.945 killing process with pid 1146751 00:27:16.945 10:22:38 compress_isal -- common/autotest_common.sh@968 -- # kill 1146751 00:27:16.945 Received shutdown signal, test time was about 3.000000 seconds 00:27:16.945 00:27:16.945 Latency(us) 00:27:16.945 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:16.945 =================================================================================================================== 00:27:16.945 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:16.945 10:22:38 compress_isal -- common/autotest_common.sh@973 -- # wait 1146751 00:27:19.489 10:22:41 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:27:19.489 10:22:41 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:19.489 10:22:41 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1148896 00:27:19.489 10:22:41 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:19.489 10:22:41 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1148896 00:27:19.489 10:22:41 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:27:19.489 10:22:41 compress_isal -- common/autotest_common.sh@830 -- # '[' 
-z 1148896 ']' 00:27:19.489 10:22:41 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:19.489 10:22:41 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:19.489 10:22:41 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:19.489 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:19.489 10:22:41 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:19.489 10:22:41 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:19.489 [2024-06-10 10:22:41.245388] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:27:19.489 [2024-06-10 10:22:41.245443] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1148896 ] 00:27:19.489 [2024-06-10 10:22:41.313426] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:19.749 [2024-06-10 10:22:41.376122] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:27:19.749 [2024-06-10 10:22:41.376127] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:27:20.321 10:22:42 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:20.321 10:22:42 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:27:20.321 10:22:42 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:27:20.321 10:22:42 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:20.321 10:22:42 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:23.678 10:22:45 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:23.678 10:22:45 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:27:23.678 10:22:45 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:23.678 10:22:45 compress_isal -- common/autotest_common.sh@900 -- # local i 00:27:23.678 10:22:45 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:23.678 10:22:45 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:23.678 10:22:45 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:23.678 10:22:45 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:23.678 [ 00:27:23.678 { 00:27:23.678 "name": "Nvme0n1", 00:27:23.678 "aliases": [ 00:27:23.678 "b53d4a8c-a86b-4db0-9a81-993ddb3e453c" 00:27:23.678 ], 00:27:23.678 "product_name": "NVMe disk", 00:27:23.678 "block_size": 512, 00:27:23.678 "num_blocks": 3907029168, 00:27:23.678 "uuid": "b53d4a8c-a86b-4db0-9a81-993ddb3e453c", 00:27:23.678 "assigned_rate_limits": { 00:27:23.678 "rw_ios_per_sec": 0, 00:27:23.678 "rw_mbytes_per_sec": 0, 00:27:23.678 "r_mbytes_per_sec": 0, 00:27:23.678 "w_mbytes_per_sec": 0 00:27:23.678 }, 00:27:23.678 "claimed": false, 00:27:23.678 "zoned": false, 00:27:23.678 "supported_io_types": { 00:27:23.678 "read": true, 00:27:23.678 "write": true, 00:27:23.678 "unmap": true, 00:27:23.678 "write_zeroes": true, 00:27:23.678 "flush": true, 
00:27:23.678 "reset": true, 00:27:23.678 "compare": false, 00:27:23.678 "compare_and_write": false, 00:27:23.678 "abort": true, 00:27:23.678 "nvme_admin": true, 00:27:23.678 "nvme_io": true 00:27:23.678 }, 00:27:23.678 "driver_specific": { 00:27:23.678 "nvme": [ 00:27:23.678 { 00:27:23.678 "pci_address": "0000:65:00.0", 00:27:23.678 "trid": { 00:27:23.678 "trtype": "PCIe", 00:27:23.678 "traddr": "0000:65:00.0" 00:27:23.678 }, 00:27:23.678 "ctrlr_data": { 00:27:23.678 "cntlid": 0, 00:27:23.678 "vendor_id": "0x8086", 00:27:23.678 "model_number": "INTEL SSDPE2KX020T8", 00:27:23.678 "serial_number": "PHLJ9512038S2P0BGN", 00:27:23.678 "firmware_revision": "VDV10184", 00:27:23.678 "oacs": { 00:27:23.678 "security": 0, 00:27:23.678 "format": 1, 00:27:23.678 "firmware": 1, 00:27:23.678 "ns_manage": 1 00:27:23.678 }, 00:27:23.678 "multi_ctrlr": false, 00:27:23.678 "ana_reporting": false 00:27:23.678 }, 00:27:23.678 "vs": { 00:27:23.678 "nvme_version": "1.2" 00:27:23.678 }, 00:27:23.678 "ns_data": { 00:27:23.678 "id": 1, 00:27:23.678 "can_share": false 00:27:23.678 } 00:27:23.678 } 00:27:23.678 ], 00:27:23.678 "mp_policy": "active_passive" 00:27:23.678 } 00:27:23.678 } 00:27:23.678 ] 00:27:23.939 10:22:45 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:27:23.939 10:22:45 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:24.883 b5d48c4a-3a2d-442d-998d-902aa273d925 00:27:25.144 10:22:46 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:25.145 7c2928d6-773b-48c4-b01e-5b493790cee8 00:27:25.145 10:22:46 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:25.145 10:22:46 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:27:25.145 10:22:46 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:25.145 10:22:46 compress_isal -- common/autotest_common.sh@900 -- # local i 00:27:25.145 10:22:46 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:25.145 10:22:46 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:25.145 10:22:46 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:25.405 10:22:47 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:25.667 [ 00:27:25.667 { 00:27:25.667 "name": "7c2928d6-773b-48c4-b01e-5b493790cee8", 00:27:25.667 "aliases": [ 00:27:25.667 "lvs0/lv0" 00:27:25.667 ], 00:27:25.667 "product_name": "Logical Volume", 00:27:25.667 "block_size": 512, 00:27:25.667 "num_blocks": 204800, 00:27:25.667 "uuid": "7c2928d6-773b-48c4-b01e-5b493790cee8", 00:27:25.667 "assigned_rate_limits": { 00:27:25.667 "rw_ios_per_sec": 0, 00:27:25.667 "rw_mbytes_per_sec": 0, 00:27:25.667 "r_mbytes_per_sec": 0, 00:27:25.667 "w_mbytes_per_sec": 0 00:27:25.667 }, 00:27:25.667 "claimed": false, 00:27:25.667 "zoned": false, 00:27:25.667 "supported_io_types": { 00:27:25.667 "read": true, 00:27:25.667 "write": true, 00:27:25.667 "unmap": true, 00:27:25.667 "write_zeroes": true, 00:27:25.667 "flush": false, 00:27:25.667 "reset": true, 00:27:25.667 "compare": false, 00:27:25.667 "compare_and_write": false, 00:27:25.667 "abort": false, 00:27:25.667 "nvme_admin": false, 00:27:25.667 
"nvme_io": false 00:27:25.667 }, 00:27:25.667 "driver_specific": { 00:27:25.667 "lvol": { 00:27:25.667 "lvol_store_uuid": "b5d48c4a-3a2d-442d-998d-902aa273d925", 00:27:25.667 "base_bdev": "Nvme0n1", 00:27:25.667 "thin_provision": true, 00:27:25.667 "num_allocated_clusters": 0, 00:27:25.667 "snapshot": false, 00:27:25.667 "clone": false, 00:27:25.667 "esnap_clone": false 00:27:25.667 } 00:27:25.667 } 00:27:25.667 } 00:27:25.667 ] 00:27:25.667 10:22:47 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:27:25.667 10:22:47 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:27:25.667 10:22:47 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:27:25.927 [2024-06-10 10:22:47.546862] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:25.927 COMP_lvs0/lv0 00:27:25.927 10:22:47 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:25.927 10:22:47 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:27:25.927 10:22:47 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:25.927 10:22:47 compress_isal -- common/autotest_common.sh@900 -- # local i 00:27:25.927 10:22:47 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:25.927 10:22:47 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:25.927 10:22:47 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:25.927 10:22:47 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:26.188 [ 00:27:26.188 { 00:27:26.188 "name": "COMP_lvs0/lv0", 00:27:26.188 "aliases": [ 00:27:26.188 "fc47e732-1a6c-5368-856b-a63d34567b1c" 00:27:26.188 ], 00:27:26.188 "product_name": "compress", 00:27:26.188 "block_size": 4096, 00:27:26.188 "num_blocks": 25088, 00:27:26.188 "uuid": "fc47e732-1a6c-5368-856b-a63d34567b1c", 00:27:26.188 "assigned_rate_limits": { 00:27:26.188 "rw_ios_per_sec": 0, 00:27:26.188 "rw_mbytes_per_sec": 0, 00:27:26.188 "r_mbytes_per_sec": 0, 00:27:26.188 "w_mbytes_per_sec": 0 00:27:26.188 }, 00:27:26.188 "claimed": false, 00:27:26.188 "zoned": false, 00:27:26.188 "supported_io_types": { 00:27:26.188 "read": true, 00:27:26.188 "write": true, 00:27:26.188 "unmap": false, 00:27:26.188 "write_zeroes": true, 00:27:26.188 "flush": false, 00:27:26.188 "reset": false, 00:27:26.188 "compare": false, 00:27:26.188 "compare_and_write": false, 00:27:26.188 "abort": false, 00:27:26.188 "nvme_admin": false, 00:27:26.188 "nvme_io": false 00:27:26.188 }, 00:27:26.188 "driver_specific": { 00:27:26.188 "compress": { 00:27:26.188 "name": "COMP_lvs0/lv0", 00:27:26.188 "base_bdev_name": "7c2928d6-773b-48c4-b01e-5b493790cee8" 00:27:26.188 } 00:27:26.188 } 00:27:26.188 } 00:27:26.188 ] 00:27:26.188 10:22:47 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:27:26.188 10:22:47 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:26.449 Running I/O for 3 seconds... 
00:27:29.750 00:27:29.750 Latency(us) 00:27:29.750 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:29.750 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:29.750 Verification LBA range: start 0x0 length 0x3100 00:27:29.750 COMP_lvs0/lv0 : 3.01 3136.79 12.25 0.00 0.00 10163.78 63.80 17140.18 00:27:29.750 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:29.750 Verification LBA range: start 0x3100 length 0x3100 00:27:29.750 COMP_lvs0/lv0 : 3.01 3132.24 12.24 0.00 0.00 10181.58 57.90 16837.71 00:27:29.750 =================================================================================================================== 00:27:29.750 Total : 6269.03 24.49 0.00 0.00 10172.68 57.90 17140.18 00:27:29.750 0 00:27:29.750 10:22:51 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:27:29.750 10:22:51 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:29.750 10:22:51 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:29.750 10:22:51 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:29.750 10:22:51 compress_isal -- compress/compress.sh@78 -- # killprocess 1148896 00:27:29.750 10:22:51 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 1148896 ']' 00:27:29.750 10:22:51 compress_isal -- common/autotest_common.sh@953 -- # kill -0 1148896 00:27:29.750 10:22:51 compress_isal -- common/autotest_common.sh@954 -- # uname 00:27:29.750 10:22:51 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:29.750 10:22:51 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1148896 00:27:29.750 10:22:51 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:27:29.750 10:22:51 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:27:29.750 10:22:51 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1148896' 00:27:29.750 killing process with pid 1148896 00:27:29.750 10:22:51 compress_isal -- common/autotest_common.sh@968 -- # kill 1148896 00:27:29.750 Received shutdown signal, test time was about 3.000000 seconds 00:27:29.750 00:27:29.750 Latency(us) 00:27:29.751 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:29.751 =================================================================================================================== 00:27:29.751 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:29.751 10:22:51 compress_isal -- common/autotest_common.sh@973 -- # wait 1148896 00:27:32.297 10:22:53 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:27:32.297 10:22:53 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:32.297 10:22:53 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1151055 00:27:32.297 10:22:53 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:32.297 10:22:53 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1151055 00:27:32.297 10:22:53 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:27:32.297 10:22:53 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 1151055 ']' 00:27:32.297 10:22:53 compress_isal -- 
common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:32.297 10:22:53 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:32.297 10:22:53 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:32.297 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:32.297 10:22:53 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:32.297 10:22:53 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:32.297 [2024-06-10 10:22:54.030253] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:27:32.297 [2024-06-10 10:22:54.030315] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1151055 ] 00:27:32.297 [2024-06-10 10:22:54.098156] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:32.558 [2024-06-10 10:22:54.184620] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:27:32.558 [2024-06-10 10:22:54.184743] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:27:32.558 [2024-06-10 10:22:54.184746] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:33.131 10:22:54 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:33.131 10:22:54 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:27:33.131 10:22:54 compress_isal -- compress/compress.sh@58 -- # create_vols 00:27:33.131 10:22:54 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:33.131 10:22:54 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:36.435 10:22:57 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:36.435 10:22:57 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:27:36.435 10:22:57 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:36.435 10:22:57 compress_isal -- common/autotest_common.sh@900 -- # local i 00:27:36.435 10:22:57 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:36.435 10:22:57 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:36.435 10:22:57 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:36.435 10:22:58 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:36.696 [ 00:27:36.696 { 00:27:36.696 "name": "Nvme0n1", 00:27:36.696 "aliases": [ 00:27:36.696 "f796b7cc-5625-43f7-a409-4beec9a357df" 00:27:36.696 ], 00:27:36.696 "product_name": "NVMe disk", 00:27:36.696 "block_size": 512, 00:27:36.696 "num_blocks": 3907029168, 00:27:36.696 "uuid": "f796b7cc-5625-43f7-a409-4beec9a357df", 00:27:36.696 "assigned_rate_limits": { 00:27:36.696 "rw_ios_per_sec": 0, 00:27:36.696 "rw_mbytes_per_sec": 0, 00:27:36.696 "r_mbytes_per_sec": 0, 00:27:36.696 "w_mbytes_per_sec": 0 00:27:36.696 }, 00:27:36.696 "claimed": false, 00:27:36.696 "zoned": false, 00:27:36.696 "supported_io_types": { 00:27:36.696 "read": true, 00:27:36.696 "write": true, 00:27:36.696 "unmap": true, 00:27:36.696 
"write_zeroes": true, 00:27:36.696 "flush": true, 00:27:36.696 "reset": true, 00:27:36.696 "compare": false, 00:27:36.696 "compare_and_write": false, 00:27:36.696 "abort": true, 00:27:36.696 "nvme_admin": true, 00:27:36.696 "nvme_io": true 00:27:36.696 }, 00:27:36.696 "driver_specific": { 00:27:36.696 "nvme": [ 00:27:36.696 { 00:27:36.696 "pci_address": "0000:65:00.0", 00:27:36.696 "trid": { 00:27:36.696 "trtype": "PCIe", 00:27:36.696 "traddr": "0000:65:00.0" 00:27:36.696 }, 00:27:36.696 "ctrlr_data": { 00:27:36.696 "cntlid": 0, 00:27:36.696 "vendor_id": "0x8086", 00:27:36.696 "model_number": "INTEL SSDPE2KX020T8", 00:27:36.696 "serial_number": "PHLJ9512038S2P0BGN", 00:27:36.696 "firmware_revision": "VDV10184", 00:27:36.696 "oacs": { 00:27:36.696 "security": 0, 00:27:36.696 "format": 1, 00:27:36.696 "firmware": 1, 00:27:36.696 "ns_manage": 1 00:27:36.696 }, 00:27:36.696 "multi_ctrlr": false, 00:27:36.696 "ana_reporting": false 00:27:36.696 }, 00:27:36.696 "vs": { 00:27:36.696 "nvme_version": "1.2" 00:27:36.696 }, 00:27:36.696 "ns_data": { 00:27:36.696 "id": 1, 00:27:36.696 "can_share": false 00:27:36.696 } 00:27:36.696 } 00:27:36.696 ], 00:27:36.696 "mp_policy": "active_passive" 00:27:36.696 } 00:27:36.696 } 00:27:36.696 ] 00:27:36.696 10:22:58 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:27:36.696 10:22:58 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:38.080 45b85aa2-fd7d-4f3e-a5f6-019c98d9c74f 00:27:38.080 10:22:59 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:38.081 ae26051f-8f50-49ec-ab0e-f1c63aff2d75 00:27:38.081 10:22:59 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:38.081 10:22:59 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:27:38.081 10:22:59 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:38.081 10:22:59 compress_isal -- common/autotest_common.sh@900 -- # local i 00:27:38.081 10:22:59 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:38.081 10:22:59 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:38.081 10:22:59 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:38.341 10:23:00 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:38.341 [ 00:27:38.341 { 00:27:38.341 "name": "ae26051f-8f50-49ec-ab0e-f1c63aff2d75", 00:27:38.341 "aliases": [ 00:27:38.341 "lvs0/lv0" 00:27:38.341 ], 00:27:38.341 "product_name": "Logical Volume", 00:27:38.341 "block_size": 512, 00:27:38.341 "num_blocks": 204800, 00:27:38.341 "uuid": "ae26051f-8f50-49ec-ab0e-f1c63aff2d75", 00:27:38.341 "assigned_rate_limits": { 00:27:38.341 "rw_ios_per_sec": 0, 00:27:38.341 "rw_mbytes_per_sec": 0, 00:27:38.341 "r_mbytes_per_sec": 0, 00:27:38.341 "w_mbytes_per_sec": 0 00:27:38.341 }, 00:27:38.341 "claimed": false, 00:27:38.341 "zoned": false, 00:27:38.341 "supported_io_types": { 00:27:38.341 "read": true, 00:27:38.341 "write": true, 00:27:38.341 "unmap": true, 00:27:38.341 "write_zeroes": true, 00:27:38.341 "flush": false, 00:27:38.341 "reset": true, 00:27:38.341 "compare": false, 00:27:38.341 "compare_and_write": false, 00:27:38.341 "abort": 
false, 00:27:38.341 "nvme_admin": false, 00:27:38.341 "nvme_io": false 00:27:38.341 }, 00:27:38.341 "driver_specific": { 00:27:38.341 "lvol": { 00:27:38.341 "lvol_store_uuid": "45b85aa2-fd7d-4f3e-a5f6-019c98d9c74f", 00:27:38.341 "base_bdev": "Nvme0n1", 00:27:38.341 "thin_provision": true, 00:27:38.341 "num_allocated_clusters": 0, 00:27:38.341 "snapshot": false, 00:27:38.341 "clone": false, 00:27:38.341 "esnap_clone": false 00:27:38.341 } 00:27:38.341 } 00:27:38.341 } 00:27:38.341 ] 00:27:38.601 10:23:00 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:27:38.601 10:23:00 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:27:38.601 10:23:00 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:27:38.601 [2024-06-10 10:23:00.393998] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:38.601 COMP_lvs0/lv0 00:27:38.601 10:23:00 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:38.601 10:23:00 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:27:38.601 10:23:00 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:38.601 10:23:00 compress_isal -- common/autotest_common.sh@900 -- # local i 00:27:38.601 10:23:00 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:38.601 10:23:00 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:38.601 10:23:00 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:38.861 10:23:00 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:39.122 [ 00:27:39.122 { 00:27:39.122 "name": "COMP_lvs0/lv0", 00:27:39.122 "aliases": [ 00:27:39.122 "b72aacc8-e438-5442-b74a-52b92d977842" 00:27:39.122 ], 00:27:39.122 "product_name": "compress", 00:27:39.122 "block_size": 512, 00:27:39.122 "num_blocks": 200704, 00:27:39.122 "uuid": "b72aacc8-e438-5442-b74a-52b92d977842", 00:27:39.122 "assigned_rate_limits": { 00:27:39.122 "rw_ios_per_sec": 0, 00:27:39.122 "rw_mbytes_per_sec": 0, 00:27:39.122 "r_mbytes_per_sec": 0, 00:27:39.122 "w_mbytes_per_sec": 0 00:27:39.122 }, 00:27:39.122 "claimed": false, 00:27:39.122 "zoned": false, 00:27:39.122 "supported_io_types": { 00:27:39.122 "read": true, 00:27:39.122 "write": true, 00:27:39.122 "unmap": false, 00:27:39.122 "write_zeroes": true, 00:27:39.122 "flush": false, 00:27:39.122 "reset": false, 00:27:39.122 "compare": false, 00:27:39.122 "compare_and_write": false, 00:27:39.122 "abort": false, 00:27:39.122 "nvme_admin": false, 00:27:39.122 "nvme_io": false 00:27:39.122 }, 00:27:39.122 "driver_specific": { 00:27:39.122 "compress": { 00:27:39.122 "name": "COMP_lvs0/lv0", 00:27:39.122 "base_bdev_name": "ae26051f-8f50-49ec-ab0e-f1c63aff2d75" 00:27:39.122 } 00:27:39.122 } 00:27:39.122 } 00:27:39.122 ] 00:27:39.122 10:23:00 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:27:39.123 10:23:00 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:27:39.123 I/O targets: 00:27:39.123 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:27:39.123 00:27:39.123 00:27:39.123 CUnit - A unit testing framework for C - Version 2.1-3 00:27:39.123 
http://cunit.sourceforge.net/ 00:27:39.123 00:27:39.123 00:27:39.123 Suite: bdevio tests on: COMP_lvs0/lv0 00:27:39.123 Test: blockdev write read block ...passed 00:27:39.123 Test: blockdev write zeroes read block ...passed 00:27:39.123 Test: blockdev write zeroes read no split ...passed 00:27:39.123 Test: blockdev write zeroes read split ...passed 00:27:39.123 Test: blockdev write zeroes read split partial ...passed 00:27:39.123 Test: blockdev reset ...[2024-06-10 10:23:00.932387] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:27:39.123 passed 00:27:39.123 Test: blockdev write read 8 blocks ...passed 00:27:39.123 Test: blockdev write read size > 128k ...passed 00:27:39.123 Test: blockdev write read invalid size ...passed 00:27:39.123 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:39.123 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:39.123 Test: blockdev write read max offset ...passed 00:27:39.123 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:39.123 Test: blockdev writev readv 8 blocks ...passed 00:27:39.123 Test: blockdev writev readv 30 x 1block ...passed 00:27:39.123 Test: blockdev writev readv block ...passed 00:27:39.123 Test: blockdev writev readv size > 128k ...passed 00:27:39.123 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:39.123 Test: blockdev comparev and writev ...passed 00:27:39.123 Test: blockdev nvme passthru rw ...passed 00:27:39.123 Test: blockdev nvme passthru vendor specific ...passed 00:27:39.123 Test: blockdev nvme admin passthru ...passed 00:27:39.123 Test: blockdev copy ...passed 00:27:39.123 00:27:39.123 Run Summary: Type Total Ran Passed Failed Inactive 00:27:39.123 suites 1 1 n/a 0 0 00:27:39.123 tests 23 23 23 0 0 00:27:39.123 asserts 130 130 130 0 n/a 00:27:39.123 00:27:39.123 Elapsed time = 0.159 seconds 00:27:39.123 0 00:27:39.123 10:23:00 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:27:39.123 10:23:00 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:39.382 10:23:01 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:39.641 10:23:01 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:27:39.641 10:23:01 compress_isal -- compress/compress.sh@62 -- # killprocess 1151055 00:27:39.641 10:23:01 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 1151055 ']' 00:27:39.641 10:23:01 compress_isal -- common/autotest_common.sh@953 -- # kill -0 1151055 00:27:39.641 10:23:01 compress_isal -- common/autotest_common.sh@954 -- # uname 00:27:39.641 10:23:01 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:39.641 10:23:01 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1151055 00:27:39.641 10:23:01 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:39.641 10:23:01 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:39.641 10:23:01 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1151055' 00:27:39.641 killing process with pid 1151055 00:27:39.641 10:23:01 compress_isal -- common/autotest_common.sh@968 -- # kill 1151055 00:27:39.641 10:23:01 compress_isal -- common/autotest_common.sh@973 -- # wait 1151055 00:27:42.188 10:23:03 
compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:27:42.188 10:23:03 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:27:42.188 00:27:42.188 real 0m47.820s 00:27:42.188 user 1m49.422s 00:27:42.188 sys 0m2.831s 00:27:42.188 10:23:03 compress_isal -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:42.188 10:23:03 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:42.188 ************************************ 00:27:42.188 END TEST compress_isal 00:27:42.188 ************************************ 00:27:42.188 10:23:03 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:27:42.188 10:23:03 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:27:42.188 10:23:03 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:27:42.188 10:23:03 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:27:42.188 10:23:03 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:42.188 10:23:03 -- common/autotest_common.sh@10 -- # set +x 00:27:42.188 ************************************ 00:27:42.188 START TEST blockdev_crypto_aesni 00:27:42.188 ************************************ 00:27:42.188 10:23:03 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:27:42.188 * Looking for test storage... 00:27:42.188 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:42.188 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:27:42.188 10:23:03 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:27:42.188 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:27:42.188 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:42.188 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 
00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1152674 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1152674 00:27:42.189 10:23:03 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:27:42.189 10:23:03 blockdev_crypto_aesni -- common/autotest_common.sh@830 -- # '[' -z 1152674 ']' 00:27:42.189 10:23:03 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:42.189 10:23:03 blockdev_crypto_aesni -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:42.189 10:23:03 blockdev_crypto_aesni -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:42.189 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:42.189 10:23:03 blockdev_crypto_aesni -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:42.189 10:23:03 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:42.189 [2024-06-10 10:23:03.935469] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:27:42.189 [2024-06-10 10:23:03.935533] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1152674 ] 00:27:42.189 [2024-06-10 10:23:04.025241] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:42.450 [2024-06-10 10:23:04.093761] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:43.021 10:23:04 blockdev_crypto_aesni -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:43.021 10:23:04 blockdev_crypto_aesni -- common/autotest_common.sh@863 -- # return 0 00:27:43.021 10:23:04 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:27:43.021 10:23:04 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:27:43.021 10:23:04 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:27:43.021 10:23:04 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:43.021 10:23:04 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:43.021 [2024-06-10 10:23:04.779722] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:43.021 [2024-06-10 10:23:04.787755] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:43.021 [2024-06-10 10:23:04.795769] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:43.021 [2024-06-10 10:23:04.843621] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:45.569 true 00:27:45.569 true 00:27:45.569 true 00:27:45.569 true 00:27:45.569 Malloc0 00:27:45.569 Malloc1 00:27:45.569 Malloc2 00:27:45.569 Malloc3 00:27:45.569 [2024-06-10 10:23:07.117407] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:45.569 crypto_ram 00:27:45.569 [2024-06-10 10:23:07.125426] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:45.569 crypto_ram2 00:27:45.569 [2024-06-10 10:23:07.133447] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:45.569 crypto_ram3 00:27:45.569 [2024-06-10 10:23:07.141470] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:45.569 crypto_ram4 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 
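setup_crypto_aesni_conf is driven through rpc_cmd inside blockdev.sh, so only its NOTICE output shows up here. As a rough, non-authoritative sketch, an equivalent configuration via standalone rpc.py calls might look like the lines below; the key material, Malloc sizes and exact option spellings are assumptions for illustration, and only the driver name, the encrypt/decrypt assignments, the Malloc0-Malloc3 / crypto_ram-crypto_ram4 bdev names and the test_dek_aesni_cbc_1-4 key names are actually taken from the log.

    rpc.py dpdk_cryptodev_set_driver -d crypto_aesni_mb              # "Using driver crypto_aesni_mb"
    rpc.py accel_assign_opc -o encrypt -m dpdk_cryptodev
    rpc.py accel_assign_opc -o decrypt -m dpdk_cryptodev
    rpc.py framework_start_init                                      # spdk_tgt was started with --wait-for-rpc
    rpc.py bdev_malloc_create -b Malloc0 16 512                      # size and block size are placeholders
    rpc.py accel_crypto_key_create -c AES_CBC -k <hex key> -n test_dek_aesni_cbc_1
    rpc.py bdev_crypto_create -n test_dek_aesni_cbc_1 Malloc0 crypto_ram
    # repeated for Malloc1..Malloc3 / crypto_ram2..crypto_ram4 with test_dek_aesni_cbc_2..4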
00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "6f1be63c-3dc3-501a-9da6-48183f3f33ea"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6f1be63c-3dc3-501a-9da6-48183f3f33ea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "7ca639a5-6a38-5454-8d19-e9bc54e4f983"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7ca639a5-6a38-5454-8d19-e9bc54e4f983",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e8b373ef-b44f-5f9d-851c-8a4ec91edf02"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "e8b373ef-b44f-5f9d-851c-8a4ec91edf02",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "86e5f972-a89d-5fe3-9733-595c58e7feb4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "86e5f972-a89d-5fe3-9733-595c58e7feb4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:27:45.569 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 1152674 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@949 -- # '[' -z 1152674 ']' 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # kill -0 1152674 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # uname 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1152674 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1152674' 00:27:45.569 killing process with pid 1152674 00:27:45.569 10:23:07 blockdev_crypto_aesni -- 
common/autotest_common.sh@968 -- # kill 1152674 00:27:45.569 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@973 -- # wait 1152674 00:27:45.831 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:45.831 10:23:07 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:45.831 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:27:45.831 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:45.831 10:23:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:46.092 ************************************ 00:27:46.092 START TEST bdev_hello_world 00:27:46.092 ************************************ 00:27:46.092 10:23:07 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:46.092 [2024-06-10 10:23:07.757604] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:27:46.092 [2024-06-10 10:23:07.757652] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1153311 ] 00:27:46.092 [2024-06-10 10:23:07.846243] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:46.092 [2024-06-10 10:23:07.914298] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:46.092 [2024-06-10 10:23:07.935298] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:46.092 [2024-06-10 10:23:07.943327] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:46.092 [2024-06-10 10:23:07.951356] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:46.353 [2024-06-10 10:23:08.043987] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:48.895 [2024-06-10 10:23:10.207198] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:48.895 [2024-06-10 10:23:10.207255] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:48.895 [2024-06-10 10:23:10.207263] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:48.895 [2024-06-10 10:23:10.215216] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:48.895 [2024-06-10 10:23:10.215228] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:48.895 [2024-06-10 10:23:10.215233] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:48.895 [2024-06-10 10:23:10.223235] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:48.895 [2024-06-10 10:23:10.223246] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:48.895 [2024-06-10 10:23:10.223251] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev 
creation deferred pending base bdev arrival 00:27:48.895 [2024-06-10 10:23:10.231257] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:48.895 [2024-06-10 10:23:10.231268] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:48.895 [2024-06-10 10:23:10.231274] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:48.895 [2024-06-10 10:23:10.292437] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:27:48.895 [2024-06-10 10:23:10.292468] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:27:48.895 [2024-06-10 10:23:10.292478] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:27:48.895 [2024-06-10 10:23:10.293507] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:27:48.895 [2024-06-10 10:23:10.293566] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:27:48.895 [2024-06-10 10:23:10.293576] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:27:48.895 [2024-06-10 10:23:10.293608] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:27:48.895 00:27:48.895 [2024-06-10 10:23:10.293620] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:27:48.895 00:27:48.895 real 0m2.820s 00:27:48.895 user 0m2.540s 00:27:48.895 sys 0m0.243s 00:27:48.895 10:23:10 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:48.895 10:23:10 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:27:48.895 ************************************ 00:27:48.895 END TEST bdev_hello_world 00:27:48.895 ************************************ 00:27:48.895 10:23:10 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:27:48.895 10:23:10 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:27:48.895 10:23:10 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:48.895 10:23:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:48.895 ************************************ 00:27:48.895 START TEST bdev_bounds 00:27:48.895 ************************************ 00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1153909 00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1153909' 00:27:48.896 Process bdevio pid: 1153909 00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1153909 00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 1153909 ']' 00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 
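The bdev_hello_world test above reduces to a single run of the hello_bdev example against the first crypto vbdev. A standalone reproduction under this job's paths would look roughly like the sketch below, assuming bdev.json is the configuration the test saved earlier in this run:
  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./build/examples/hello_bdev --json test/bdev/bdev.json -b crypto_ram
  # expected NOTICE flow, as logged above: open the bdev, open an io channel,
  # write "Hello World!", read it back, then stop the app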
00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:48.896 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:48.896 10:23:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:48.896 [2024-06-10 10:23:10.651388] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:27:48.896 [2024-06-10 10:23:10.651437] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1153909 ] 00:27:48.896 [2024-06-10 10:23:10.739084] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:49.168 [2024-06-10 10:23:10.805191] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:27:49.168 [2024-06-10 10:23:10.805301] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:27:49.168 [2024-06-10 10:23:10.805304] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:49.168 [2024-06-10 10:23:10.826332] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:49.168 [2024-06-10 10:23:10.834360] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:49.168 [2024-06-10 10:23:10.842377] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:49.168 [2024-06-10 10:23:10.927040] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:51.786 [2024-06-10 10:23:13.089487] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:51.786 [2024-06-10 10:23:13.089538] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:51.786 [2024-06-10 10:23:13.089547] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:51.786 [2024-06-10 10:23:13.097504] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:51.786 [2024-06-10 10:23:13.097517] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:51.786 [2024-06-10 10:23:13.097523] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:51.786 [2024-06-10 10:23:13.105524] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:51.786 [2024-06-10 10:23:13.105534] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:51.786 [2024-06-10 10:23:13.105540] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:51.786 [2024-06-10 10:23:13.113545] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:51.786 [2024-06-10 10:23:13.113555] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:51.786 [2024-06-10 10:23:13.113561] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:51.786 10:23:13 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:51.786 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:27:51.786 10:23:13 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:27:51.786 I/O targets: 00:27:51.786 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:27:51.786 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:27:51.786 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:27:51.786 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:27:51.786 00:27:51.786 00:27:51.786 CUnit - A unit testing framework for C - Version 2.1-3 00:27:51.786 http://cunit.sourceforge.net/ 00:27:51.786 00:27:51.786 00:27:51.786 Suite: bdevio tests on: crypto_ram4 00:27:51.786 Test: blockdev write read block ...passed 00:27:51.786 Test: blockdev write zeroes read block ...passed 00:27:51.786 Test: blockdev write zeroes read no split ...passed 00:27:51.786 Test: blockdev write zeroes read split ...passed 00:27:51.786 Test: blockdev write zeroes read split partial ...passed 00:27:51.786 Test: blockdev reset ...passed 00:27:51.786 Test: blockdev write read 8 blocks ...passed 00:27:51.786 Test: blockdev write read size > 128k ...passed 00:27:51.786 Test: blockdev write read invalid size ...passed 00:27:51.786 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:51.786 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:51.786 Test: blockdev write read max offset ...passed 00:27:51.786 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:51.786 Test: blockdev writev readv 8 blocks ...passed 00:27:51.786 Test: blockdev writev readv 30 x 1block ...passed 00:27:51.786 Test: blockdev writev readv block ...passed 00:27:51.786 Test: blockdev writev readv size > 128k ...passed 00:27:51.786 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:51.786 Test: blockdev comparev and writev ...passed 00:27:51.786 Test: blockdev nvme passthru rw ...passed 00:27:51.786 Test: blockdev nvme passthru vendor specific ...passed 00:27:51.786 Test: blockdev nvme admin passthru ...passed 00:27:51.786 Test: blockdev copy ...passed 00:27:51.786 Suite: bdevio tests on: crypto_ram3 00:27:51.786 Test: blockdev write read block ...passed 00:27:51.786 Test: blockdev write zeroes read block ...passed 00:27:51.786 Test: blockdev write zeroes read no split ...passed 00:27:51.786 Test: blockdev write zeroes read split ...passed 00:27:51.786 Test: blockdev write zeroes read split partial ...passed 00:27:51.786 Test: blockdev reset ...passed 00:27:51.786 Test: blockdev write read 8 blocks ...passed 00:27:51.786 Test: blockdev write read size > 128k ...passed 00:27:51.786 Test: blockdev write read invalid size ...passed 00:27:51.786 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:51.786 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:51.786 Test: blockdev write read max offset ...passed 00:27:51.786 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:51.786 Test: blockdev writev readv 8 blocks ...passed 00:27:51.786 Test: blockdev writev readv 30 x 1block ...passed 00:27:51.786 Test: blockdev writev readv block ...passed 00:27:51.786 Test: blockdev writev readv size > 128k ...passed 00:27:51.787 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:51.787 Test: blockdev 
comparev and writev ...passed 00:27:51.787 Test: blockdev nvme passthru rw ...passed 00:27:51.787 Test: blockdev nvme passthru vendor specific ...passed 00:27:51.787 Test: blockdev nvme admin passthru ...passed 00:27:51.787 Test: blockdev copy ...passed 00:27:51.787 Suite: bdevio tests on: crypto_ram2 00:27:51.787 Test: blockdev write read block ...passed 00:27:51.787 Test: blockdev write zeroes read block ...passed 00:27:51.787 Test: blockdev write zeroes read no split ...passed 00:27:51.787 Test: blockdev write zeroes read split ...passed 00:27:51.787 Test: blockdev write zeroes read split partial ...passed 00:27:51.787 Test: blockdev reset ...passed 00:27:51.787 Test: blockdev write read 8 blocks ...passed 00:27:51.787 Test: blockdev write read size > 128k ...passed 00:27:51.787 Test: blockdev write read invalid size ...passed 00:27:51.787 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:51.787 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:51.787 Test: blockdev write read max offset ...passed 00:27:51.787 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:51.787 Test: blockdev writev readv 8 blocks ...passed 00:27:51.787 Test: blockdev writev readv 30 x 1block ...passed 00:27:51.787 Test: blockdev writev readv block ...passed 00:27:51.787 Test: blockdev writev readv size > 128k ...passed 00:27:51.787 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:51.787 Test: blockdev comparev and writev ...passed 00:27:51.787 Test: blockdev nvme passthru rw ...passed 00:27:51.787 Test: blockdev nvme passthru vendor specific ...passed 00:27:51.787 Test: blockdev nvme admin passthru ...passed 00:27:51.787 Test: blockdev copy ...passed 00:27:51.787 Suite: bdevio tests on: crypto_ram 00:27:51.787 Test: blockdev write read block ...passed 00:27:51.787 Test: blockdev write zeroes read block ...passed 00:27:51.787 Test: blockdev write zeroes read no split ...passed 00:27:51.787 Test: blockdev write zeroes read split ...passed 00:27:51.787 Test: blockdev write zeroes read split partial ...passed 00:27:51.787 Test: blockdev reset ...passed 00:27:51.787 Test: blockdev write read 8 blocks ...passed 00:27:51.787 Test: blockdev write read size > 128k ...passed 00:27:51.787 Test: blockdev write read invalid size ...passed 00:27:51.787 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:51.787 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:51.787 Test: blockdev write read max offset ...passed 00:27:51.787 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:51.787 Test: blockdev writev readv 8 blocks ...passed 00:27:51.787 Test: blockdev writev readv 30 x 1block ...passed 00:27:51.787 Test: blockdev writev readv block ...passed 00:27:51.787 Test: blockdev writev readv size > 128k ...passed 00:27:51.787 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:51.787 Test: blockdev comparev and writev ...passed 00:27:51.787 Test: blockdev nvme passthru rw ...passed 00:27:51.787 Test: blockdev nvme passthru vendor specific ...passed 00:27:51.787 Test: blockdev nvme admin passthru ...passed 00:27:51.787 Test: blockdev copy ...passed 00:27:51.787 00:27:51.787 Run Summary: Type Total Ran Passed Failed Inactive 00:27:51.787 suites 4 4 n/a 0 0 00:27:51.787 tests 92 92 92 0 0 00:27:51.787 asserts 520 520 520 0 n/a 00:27:51.787 00:27:51.787 Elapsed time = 0.494 seconds 00:27:51.787 0 00:27:51.787 10:23:13 
blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1153909 00:27:51.787 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 1153909 ']' 00:27:51.787 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 1153909 00:27:51.787 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:27:51.787 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:51.787 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1153909 00:27:51.787 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:51.787 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:51.787 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1153909' 00:27:51.787 killing process with pid 1153909 00:27:51.787 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@968 -- # kill 1153909 00:27:51.787 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@973 -- # wait 1153909 00:27:52.047 10:23:13 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:27:52.047 00:27:52.047 real 0m3.241s 00:27:52.047 user 0m9.263s 00:27:52.047 sys 0m0.404s 00:27:52.047 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:52.047 10:23:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:52.047 ************************************ 00:27:52.047 END TEST bdev_bounds 00:27:52.047 ************************************ 00:27:52.047 10:23:13 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:27:52.047 10:23:13 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:27:52.047 10:23:13 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:52.047 10:23:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:52.308 ************************************ 00:27:52.308 START TEST bdev_nbd 00:27:52.308 ************************************ 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:27:52.308 10:23:13 
blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1154544 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1154544 /var/tmp/spdk-nbd.sock 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 1154544 ']' 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:27:52.308 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:52.308 10:23:13 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:52.308 [2024-06-10 10:23:13.976491] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:27:52.308 [2024-06-10 10:23:13.976536] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:52.308 [2024-06-10 10:23:14.062951] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:52.308 [2024-06-10 10:23:14.125114] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:52.308 [2024-06-10 10:23:14.146112] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:52.308 [2024-06-10 10:23:14.154133] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:52.308 [2024-06-10 10:23:14.162150] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:52.569 [2024-06-10 10:23:14.248026] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:55.115 [2024-06-10 10:23:16.413102] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:55.115 [2024-06-10 10:23:16.413153] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:55.115 [2024-06-10 10:23:16.413161] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.115 [2024-06-10 10:23:16.421120] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:55.115 [2024-06-10 10:23:16.421133] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:55.115 [2024-06-10 10:23:16.421138] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.115 [2024-06-10 10:23:16.429140] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:55.115 [2024-06-10 10:23:16.429151] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:55.115 [2024-06-10 10:23:16.429156] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.115 [2024-06-10 10:23:16.437159] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:55.115 [2024-06-10 10:23:16.437169] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:55.115 [2024-06-10 10:23:16.437175] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 
'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:55.115 1+0 records in 00:27:55.115 1+0 records out 00:27:55.115 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307501 s, 13.3 MB/s 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:55.115 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:27:55.115 10:23:16 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:55.116 1+0 records in 00:27:55.116 1+0 records out 00:27:55.116 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299887 s, 13.7 MB/s 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:55.116 10:23:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 
00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:55.377 1+0 records in 00:27:55.377 1+0 records out 00:27:55.377 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307142 s, 13.3 MB/s 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:55.377 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:55.638 1+0 records in 00:27:55.638 1+0 records out 00:27:55.638 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000179499 s, 22.8 MB/s 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:55.638 10:23:17 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:55.638 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:55.899 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:27:55.899 { 00:27:55.899 "nbd_device": "/dev/nbd0", 00:27:55.899 "bdev_name": "crypto_ram" 00:27:55.899 }, 00:27:55.899 { 00:27:55.899 "nbd_device": "/dev/nbd1", 00:27:55.899 "bdev_name": "crypto_ram2" 00:27:55.899 }, 00:27:55.899 { 00:27:55.899 "nbd_device": "/dev/nbd2", 00:27:55.899 "bdev_name": "crypto_ram3" 00:27:55.899 }, 00:27:55.899 { 00:27:55.899 "nbd_device": "/dev/nbd3", 00:27:55.899 "bdev_name": "crypto_ram4" 00:27:55.899 } 00:27:55.899 ]' 00:27:55.899 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:27:55.899 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:27:55.899 { 00:27:55.899 "nbd_device": "/dev/nbd0", 00:27:55.899 "bdev_name": "crypto_ram" 00:27:55.899 }, 00:27:55.899 { 00:27:55.899 "nbd_device": "/dev/nbd1", 00:27:55.899 "bdev_name": "crypto_ram2" 00:27:55.899 }, 00:27:55.899 { 00:27:55.899 "nbd_device": "/dev/nbd2", 00:27:55.899 "bdev_name": "crypto_ram3" 00:27:55.899 }, 00:27:55.899 { 00:27:55.899 "nbd_device": "/dev/nbd3", 00:27:55.899 "bdev_name": "crypto_ram4" 00:27:55.899 } 00:27:55.899 ]' 00:27:55.899 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:27:55.899 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:27:55.900 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:55.900 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:27:55.900 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:55.900 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:55.900 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:55.900 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:56.160 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:56.160 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:56.160 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:56.160 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:56.160 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:56.160 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:56.160 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:56.160 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 
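nbd_get_disks returns a JSON array pairing each /dev/nbdX node with its backing bdev; the test pulls out just the device names with the jq filter seen above, roughly:
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks | jq -r '.[] | .nbd_device'
  # prints /dev/nbd0 .. /dev/nbd3, one per line, while the exports are up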
00:27:56.160 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:56.160 10:23:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:27:56.421 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:56.422 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:56.422 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:56.422 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:27:56.683 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:27:56.683 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:27:56.683 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:27:56.683 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:56.683 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:56.683 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:27:56.683 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:56.683 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:56.683 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:56.683 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
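By this point all four NBD exports have been started, read, and stopped. The per-bdev pattern is: export the bdev over NBD, wait for the kernel node, read one block with O_DIRECT, then tear the export down and wait for the node to disappear. A condensed sketch of that flow (output path simplified to /tmp, otherwise the same commands as in the trace above):
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
  grep -q -w nbd0 /proc/partitions                              # poll until the device shows up
  dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct  # a single direct read must succeed
  [ "$(stat -c %s /tmp/nbdtest)" -eq 4096 ]                     # and must have produced 4 KiB
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
  grep -q -w nbd0 /proc/partitions || true                      # poll until the entry is gone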
00:27:56.683 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:27:56.943 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:56.944 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:56.944 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:27:57.205 /dev/nbd0 00:27:57.205 10:23:18 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:57.205 1+0 records in 00:27:57.205 1+0 records out 00:27:57.205 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262292 s, 15.6 MB/s 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:57.205 10:23:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:27:57.466 /dev/nbd1 00:27:57.466 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:57.466 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:57.466 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:27:57.466 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:57.467 1+0 records in 00:27:57.467 1+0 records out 00:27:57.467 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274563 s, 14.9 MB/s 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:57.467 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:27:57.728 /dev/nbd10 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:57.728 1+0 records in 00:27:57.728 1+0 records out 00:27:57.728 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271999 s, 15.1 MB/s 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # 
(( i < 4 )) 00:27:57.728 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:27:57.989 /dev/nbd11 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:57.989 1+0 records in 00:27:57.989 1+0 records out 00:27:57.989 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266982 s, 15.3 MB/s 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:57.989 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:27:57.989 { 00:27:57.989 "nbd_device": "/dev/nbd0", 00:27:57.989 "bdev_name": "crypto_ram" 00:27:57.989 }, 00:27:57.989 { 00:27:57.989 "nbd_device": "/dev/nbd1", 00:27:57.989 "bdev_name": "crypto_ram2" 00:27:57.989 }, 00:27:57.989 { 00:27:57.989 "nbd_device": "/dev/nbd10", 00:27:57.989 "bdev_name": "crypto_ram3" 00:27:57.989 }, 00:27:57.989 { 00:27:57.989 "nbd_device": "/dev/nbd11", 00:27:57.990 "bdev_name": "crypto_ram4" 00:27:57.990 } 00:27:57.990 ]' 00:27:57.990 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:27:57.990 { 00:27:57.990 "nbd_device": "/dev/nbd0", 00:27:57.990 "bdev_name": "crypto_ram" 00:27:57.990 }, 00:27:57.990 { 00:27:57.990 "nbd_device": "/dev/nbd1", 00:27:57.990 "bdev_name": "crypto_ram2" 00:27:57.990 }, 00:27:57.990 { 00:27:57.990 "nbd_device": "/dev/nbd10", 00:27:57.990 "bdev_name": "crypto_ram3" 00:27:57.990 }, 00:27:57.990 { 00:27:57.990 "nbd_device": "/dev/nbd11", 00:27:57.990 "bdev_name": "crypto_ram4" 00:27:57.990 } 00:27:57.990 ]' 00:27:57.990 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:27:58.251 /dev/nbd1 00:27:58.251 /dev/nbd10 00:27:58.251 /dev/nbd11' 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:27:58.251 /dev/nbd1 00:27:58.251 /dev/nbd10 00:27:58.251 /dev/nbd11' 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:27:58.251 256+0 records in 00:27:58.251 256+0 records out 00:27:58.251 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115236 s, 91.0 MB/s 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:27:58.251 256+0 records in 00:27:58.251 256+0 records out 00:27:58.251 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0509511 s, 20.6 MB/s 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:58.251 10:23:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:27:58.251 256+0 records in 00:27:58.251 256+0 records out 00:27:58.251 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0544222 s, 19.3 MB/s 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd 
-- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:27:58.251 256+0 records in 00:27:58.251 256+0 records out 00:27:58.251 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0413719 s, 25.3 MB/s 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:27:58.251 256+0 records in 00:27:58.251 256+0 records out 00:27:58.251 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0367126 s, 28.6 MB/s 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:58.251 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:58.511 
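The stretch of trace above is the core of the nbd data-integrity check: nbd_dd_data_verify fills a 1 MiB scratch file from /dev/urandom, writes it to each of the four nbd devices backed by the crypto bdevs with dd, then compares each device back against the same file with cmp before removing it. A condensed, hedged sketch of that pattern, with the scratch path shortened and device names taken from this run (not the literal nbd_common.sh source):

  rand=/tmp/nbdrandtest
  dd if=/dev/urandom of="$rand" bs=4096 count=256             # 1 MiB of random data
  for nbd in /dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11; do
    dd if="$rand" of="$nbd" bs=4096 count=256 oflag=direct    # write the pattern through each crypto vbdev
  done
  for nbd in /dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11; do
    cmp -b -n 1M "$rand" "$nbd"                               # byte-compare what reads back
  done
  rm "$rand"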
10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:58.511 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:58.512 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:58.512 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:58.512 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:58.512 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:58.512 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:58.512 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:58.772 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:58.772 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:58.772 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:58.772 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:58.772 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:58.772 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:58.772 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:58.772 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:58.772 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:58.772 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:59.032 10:23:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:59.292 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:59.293 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:27:59.293 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:27:59.293 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:27:59.554 malloc_lvol_verify 00:27:59.554 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:27:59.814 f4be9dee-d306-416a-93ad-08c9fa276bd9 00:27:59.814 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:28:00.073 8ce1b061-a22b-4846-ae9a-bfcf85465051 00:28:00.074 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:28:00.074 /dev/nbd0 00:28:00.074 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:28:00.074 mke2fs 1.46.5 (30-Dec-2021) 00:28:00.074 Discarding device blocks: 0/4096 done 00:28:00.074 Creating filesystem with 4096 1k blocks and 1024 inodes 00:28:00.074 00:28:00.074 Allocating group tables: 0/1 done 00:28:00.074 Writing inode tables: 0/1 done 00:28:00.074 Creating journal (1024 blocks): done 00:28:00.074 Writing superblocks and filesystem accounting information: 0/1 done 00:28:00.074 00:28:00.074 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:28:00.074 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:28:00.074 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:00.074 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:00.074 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:00.074 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:00.074 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:00.074 10:23:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1154544 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 1154544 ']' 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 1154544 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1154544 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1154544' 00:28:00.335 killing process with pid 1154544 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@968 -- # kill 1154544 00:28:00.335 10:23:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@973 -- # wait 1154544 00:28:00.599 10:23:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:28:00.599 00:28:00.599 real 0m8.480s 00:28:00.599 user 0m11.763s 00:28:00.599 sys 0m2.264s 00:28:00.599 10:23:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:00.599 10:23:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:00.599 ************************************ 00:28:00.599 END TEST bdev_nbd 00:28:00.599 ************************************ 00:28:00.599 10:23:22 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:28:00.599 10:23:22 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:28:00.599 10:23:22 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:28:00.599 10:23:22 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:28:00.599 10:23:22 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:28:00.599 10:23:22 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:00.599 10:23:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:00.599 ************************************ 00:28:00.599 START TEST bdev_fio 00:28:00.599 ************************************ 00:28:00.599 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:28:00.599 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:28:00.599 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:00.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:00.599 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:28:00.599 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:28:00.599 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:28:00.861 10:23:22 
blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:00.861 ************************************ 00:28:00.861 START TEST bdev_fio_rw_verify 00:28:00.861 ************************************ 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:28:00.861 10:23:22 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:01.122 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:01.122 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:01.122 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:01.122 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:01.122 fio-3.35 00:28:01.122 Starting 4 threads 00:28:16.027 00:28:16.027 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1156729: Mon Jun 10 10:23:35 2024 00:28:16.027 read: IOPS=37.2k, BW=145MiB/s (152MB/s)(1452MiB/10001msec) 00:28:16.027 slat (usec): min=14, max=439, avg=34.11, stdev=20.14 00:28:16.027 clat (usec): min=8, max=904, avg=184.50, stdev=117.59 00:28:16.027 lat (usec): min=23, max=1014, avg=218.61, stdev=128.32 00:28:16.027 clat percentiles (usec): 00:28:16.027 | 50.000th=[ 157], 99.000th=[ 562], 99.900th=[ 709], 99.990th=[ 791], 00:28:16.027 | 99.999th=[ 857] 00:28:16.027 write: IOPS=41.0k, BW=160MiB/s (168MB/s)(1558MiB/9724msec); 0 zone resets 00:28:16.027 slat (usec): min=14, max=773, avg=43.72, stdev=20.14 00:28:16.027 clat (usec): min=23, max=2180, avg=246.50, stdev=150.09 00:28:16.027 lat (usec): min=49, max=2394, avg=290.22, stdev=161.41 00:28:16.027 clat percentiles (usec): 00:28:16.027 | 50.000th=[ 221], 99.000th=[ 709], 99.900th=[ 873], 99.990th=[ 1352], 00:28:16.027 | 99.999th=[ 2089] 00:28:16.027 bw ( KiB/s): min=135520, max=179824, per=97.47%, avg=159868.63, stdev=3306.68, samples=76 00:28:16.027 iops : min=33880, max=44956, avg=39967.16, stdev=826.67, samples=76 00:28:16.027 lat (usec) : 10=0.01%, 20=0.01%, 50=4.52%, 100=15.28%, 250=48.90% 00:28:16.027 lat (usec) : 500=26.12%, 750=4.83%, 1000=0.32% 00:28:16.027 lat (msec) : 2=0.01%, 4=0.01% 00:28:16.027 cpu : usr=99.75%, sys=0.00%, ctx=67, majf=0, minf=270 00:28:16.027 IO depths : 1=10.2%, 2=23.4%, 4=53.0%, 8=13.4%, 16=0.0%, 32=0.0%, >=64=0.0% 00:28:16.027 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:16.027 complete : 0=0.0%, 4=88.3%, 8=11.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:16.027 issued rwts: total=371747,398726,0,0 short=0,0,0,0 dropped=0,0,0,0 
00:28:16.027 latency : target=0, window=0, percentile=100.00%, depth=8 00:28:16.027 00:28:16.027 Run status group 0 (all jobs): 00:28:16.027 READ: bw=145MiB/s (152MB/s), 145MiB/s-145MiB/s (152MB/s-152MB/s), io=1452MiB (1523MB), run=10001-10001msec 00:28:16.027 WRITE: bw=160MiB/s (168MB/s), 160MiB/s-160MiB/s (168MB/s-168MB/s), io=1558MiB (1633MB), run=9724-9724msec 00:28:16.027 00:28:16.027 real 0m13.286s 00:28:16.027 user 0m53.703s 00:28:16.027 sys 0m0.346s 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:28:16.027 ************************************ 00:28:16.027 END TEST bdev_fio_rw_verify 00:28:16.027 ************************************ 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:28:16.027 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "6f1be63c-3dc3-501a-9da6-48183f3f33ea"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6f1be63c-3dc3-501a-9da6-48183f3f33ea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "7ca639a5-6a38-5454-8d19-e9bc54e4f983"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7ca639a5-6a38-5454-8d19-e9bc54e4f983",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e8b373ef-b44f-5f9d-851c-8a4ec91edf02"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "e8b373ef-b44f-5f9d-851c-8a4ec91edf02",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "86e5f972-a89d-5fe3-9733-595c58e7feb4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "86e5f972-a89d-5fe3-9733-595c58e7feb4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:28:16.028 crypto_ram2 00:28:16.028 crypto_ram3 00:28:16.028 crypto_ram4 ]] 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "6f1be63c-3dc3-501a-9da6-48183f3f33ea"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6f1be63c-3dc3-501a-9da6-48183f3f33ea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "7ca639a5-6a38-5454-8d19-e9bc54e4f983"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7ca639a5-6a38-5454-8d19-e9bc54e4f983",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e8b373ef-b44f-5f9d-851c-8a4ec91edf02"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "e8b373ef-b44f-5f9d-851c-8a4ec91edf02",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "86e5f972-a89d-5fe3-9733-595c58e7feb4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "86e5f972-a89d-5fe3-9733-595c58e7feb4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:28:16.028 10:23:35 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:16.028 ************************************ 00:28:16.028 START TEST bdev_fio_trim 00:28:16.028 ************************************ 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:28:16.028 10:23:36 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:28:16.028 10:23:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:16.028 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:16.028 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:16.028 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:16.028 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:16.028 fio-3.35 00:28:16.028 Starting 4 threads 00:28:28.253 00:28:28.253 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1159037: Mon Jun 10 10:23:49 2024 00:28:28.253 write: IOPS=60.6k, BW=237MiB/s (248MB/s)(2369MiB/10001msec); 0 zone resets 00:28:28.253 slat (usec): min=13, max=475, avg=40.08, stdev=24.39 00:28:28.253 clat (usec): min=40, max=1400, avg=187.88, stdev=124.97 00:28:28.253 lat (usec): min=55, max=1514, avg=227.96, stdev=141.53 00:28:28.253 clat percentiles (usec): 00:28:28.253 | 50.000th=[ 147], 99.000th=[ 519], 99.900th=[ 619], 99.990th=[ 914], 00:28:28.253 | 99.999th=[ 1270] 00:28:28.253 bw ( KiB/s): min=231248, max=284240, per=100.00%, avg=243008.00, stdev=4501.58, samples=76 00:28:28.253 iops : min=57812, max=71060, avg=60752.00, stdev=1125.39, samples=76 00:28:28.253 trim: IOPS=60.6k, BW=237MiB/s (248MB/s)(2369MiB/10001msec); 0 zone resets 00:28:28.253 slat (nsec): min=4775, max=56200, avg=7935.89, stdev=3887.57 00:28:28.253 clat (usec): min=55, max=1045, avg=162.17, stdev=67.95 00:28:28.253 lat (usec): min=60, max=1053, avg=170.10, stdev=68.64 00:28:28.253 clat percentiles (usec): 00:28:28.253 | 50.000th=[ 151], 99.000th=[ 355], 99.900th=[ 424], 99.990th=[ 586], 00:28:28.253 | 99.999th=[ 848] 00:28:28.253 bw ( KiB/s): min=231248, max=284240, per=100.00%, avg=243010.53, stdev=4501.85, samples=76 00:28:28.253 iops : min=57812, max=71060, avg=60752.63, stdev=1125.46, samples=76 00:28:28.253 lat (usec) : 50=3.23%, 100=20.64%, 250=57.23%, 500=18.12%, 750=0.76% 00:28:28.253 lat (usec) : 1000=0.01% 00:28:28.253 lat (msec) : 2=0.01% 00:28:28.253 cpu : usr=99.73%, sys=0.00%, ctx=60, majf=0, minf=89 00:28:28.253 IO depths : 1=8.1%, 2=22.1%, 4=55.9%, 8=14.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:28:28.253 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:28.253 complete : 0=0.0%, 4=87.7%, 8=12.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:28.253 issued rwts: total=0,606466,606466,0 short=0,0,0,0 dropped=0,0,0,0 00:28:28.253 latency : target=0, window=0, percentile=100.00%, depth=8 00:28:28.253 00:28:28.253 Run status group 0 (all jobs): 00:28:28.253 WRITE: bw=237MiB/s (248MB/s), 237MiB/s-237MiB/s (248MB/s-248MB/s), io=2369MiB (2484MB), run=10001-10001msec 00:28:28.253 TRIM: bw=237MiB/s (248MB/s), 
237MiB/s-237MiB/s (248MB/s-248MB/s), io=2369MiB (2484MB), run=10001-10001msec 00:28:28.253 00:28:28.253 real 0m13.364s 00:28:28.253 user 0m52.715s 00:28:28.253 sys 0m0.422s 00:28:28.253 10:23:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:28.253 10:23:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:28:28.253 ************************************ 00:28:28.253 END TEST bdev_fio_trim 00:28:28.253 ************************************ 00:28:28.253 10:23:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:28:28.253 10:23:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:28.253 10:23:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:28:28.253 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:28.253 10:23:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:28:28.253 00:28:28.253 real 0m26.949s 00:28:28.253 user 1m46.585s 00:28:28.253 sys 0m0.915s 00:28:28.253 10:23:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:28.253 10:23:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:28.253 ************************************ 00:28:28.253 END TEST bdev_fio 00:28:28.253 ************************************ 00:28:28.253 10:23:49 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:28.253 10:23:49 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:28.253 10:23:49 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:28:28.253 10:23:49 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:28.253 10:23:49 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:28.253 ************************************ 00:28:28.253 START TEST bdev_verify 00:28:28.253 ************************************ 00:28:28.253 10:23:49 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:28.253 [2024-06-10 10:23:49.545649] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
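From this point the suite switches from fio to SPDK's bdevperf example for the verify pass, pointed at the same generated bdev.json. The standalone equivalent of the command just traced (path shortened to the SPDK tree; queue depth 128, 4096-byte I/Os, a 5-second verify workload on core mask 0x3):

  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./build/examples/bdevperf --json test/bdev/bdev.json \
      -q 128 -o 4096 -w verify -t 5 -C -m 0x3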
00:28:28.253 [2024-06-10 10:23:49.545716] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1160875 ] 00:28:28.253 [2024-06-10 10:23:49.639202] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:28.253 [2024-06-10 10:23:49.734188] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:28:28.253 [2024-06-10 10:23:49.734194] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:28.253 [2024-06-10 10:23:49.755391] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:28.253 [2024-06-10 10:23:49.763417] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:28.253 [2024-06-10 10:23:49.771431] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:28.253 [2024-06-10 10:23:49.862190] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:30.248 [2024-06-10 10:23:52.027157] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:30.248 [2024-06-10 10:23:52.027219] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:30.248 [2024-06-10 10:23:52.027228] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:30.248 [2024-06-10 10:23:52.035173] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:30.248 [2024-06-10 10:23:52.035185] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:30.248 [2024-06-10 10:23:52.035190] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:30.248 [2024-06-10 10:23:52.043193] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:30.248 [2024-06-10 10:23:52.043204] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:30.248 [2024-06-10 10:23:52.043210] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:30.248 [2024-06-10 10:23:52.051213] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:30.248 [2024-06-10 10:23:52.051224] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:30.248 [2024-06-10 10:23:52.051230] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:30.508 Running I/O for 5 seconds... 
00:28:35.787
00:28:35.787 Latency(us)
00:28:35.787 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:28:35.787 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:28:35.787 Verification LBA range: start 0x0 length 0x1000
00:28:35.787 crypto_ram : 5.06 538.79 2.10 0.00 0.00 236490.61 1739.22 177451.32
00:28:35.787 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:28:35.787 Verification LBA range: start 0x1000 length 0x1000
00:28:35.787 crypto_ram : 5.06 538.93 2.11 0.00 0.00 236391.89 2104.71 177451.32
00:28:35.787 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:28:35.787 Verification LBA range: start 0x0 length 0x1000
00:28:35.787 crypto_ram2 : 5.06 541.67 2.12 0.00 0.00 234776.70 3201.18 162932.58
00:28:35.787 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:28:35.787 Verification LBA range: start 0x1000 length 0x1000
00:28:35.787 crypto_ram2 : 5.06 543.48 2.12 0.00 0.00 234083.88 2129.92 162932.58
00:28:35.787 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:28:35.787 Verification LBA range: start 0x0 length 0x1000
00:28:35.787 crypto_ram3 : 5.05 4235.37 16.54 0.00 0.00 29927.79 5242.88 32667.18
00:28:35.787 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:28:35.787 Verification LBA range: start 0x1000 length 0x1000
00:28:35.787 crypto_ram3 : 5.04 4238.07 16.55 0.00 0.00 29925.33 5469.74 32667.18
00:28:35.787 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:28:35.787 Verification LBA range: start 0x0 length 0x1000
00:28:35.787 crypto_ram4 : 5.05 4252.55 16.61 0.00 0.00 29753.22 1701.42 31255.63
00:28:35.787 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:28:35.787 Verification LBA range: start 0x1000 length 0x1000
00:28:35.787 crypto_ram4 : 5.05 4256.84 16.63 0.00 0.00 29714.88 1663.61 31255.63
00:28:35.787 ===================================================================================================================
00:28:35.787 Total : 19145.69 74.79 0.00 0.00 53101.62 1663.61 177451.32
00:28:35.787
00:28:35.787 real 0m7.973s
00:28:35.787 user 0m15.306s
00:28:35.787 sys 0m0.256s
00:28:35.787 10:23:57 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable
00:28:35.787 10:23:57 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:28:35.787 ************************************
00:28:35.787 END TEST bdev_verify
00:28:35.787 ************************************
00:28:35.787 10:23:57 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:28:35.787 10:23:57 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']'
00:28:35.787 10:23:57 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable
00:28:35.787 10:23:57 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:28:35.787 ************************************
00:28:35.787 START TEST bdev_verify_big_io
00:28:35.787 ************************************
00:28:35.787 10:23:57 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:28:35.787 [2024-06-10 10:23:57.572341] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:28:35.788 [2024-06-10 10:23:57.572383] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1162119 ] 00:28:36.047 [2024-06-10 10:23:57.658588] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:36.047 [2024-06-10 10:23:57.724942] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:28:36.047 [2024-06-10 10:23:57.724961] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:36.047 [2024-06-10 10:23:57.746076] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:36.047 [2024-06-10 10:23:57.754104] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:36.047 [2024-06-10 10:23:57.762120] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:36.047 [2024-06-10 10:23:57.847324] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:38.587 [2024-06-10 10:24:00.009780] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:38.587 [2024-06-10 10:24:00.009852] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:38.587 [2024-06-10 10:24:00.009861] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:38.587 [2024-06-10 10:24:00.017796] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:38.587 [2024-06-10 10:24:00.017811] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:38.587 [2024-06-10 10:24:00.017817] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:38.587 [2024-06-10 10:24:00.025818] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:38.587 [2024-06-10 10:24:00.025835] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:38.587 [2024-06-10 10:24:00.025841] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:38.587 [2024-06-10 10:24:00.033843] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:38.587 [2024-06-10 10:24:00.033855] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:38.587 [2024-06-10 10:24:00.033860] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:38.587 Running I/O for 5 seconds... 00:28:41.146 [2024-06-10 10:24:02.706846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.146 [2024-06-10 10:24:02.708211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.146 [2024-06-10 10:24:02.709359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[ ... accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! ... the same error line is repeated several hundred more times (timestamps 10:24:02.709 through 10:24:03.092) while bdevperf runs the 5-second big-I/O verify workload ... ]
00:28:41.413 [2024-06-10 10:24:03.093669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.094234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.094268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.095420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.095998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.096034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.096335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.096369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.096591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.097309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.098493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.098526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.099670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.100293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.100328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.100643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.100674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.100892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.101670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.103271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.103304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.104891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.105551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.105585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.413 [2024-06-10 10:24:03.106248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.106280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.106549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.107371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.108529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.108564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.109675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.110417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.110454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.111751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.111784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.112058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.112744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.113903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.113940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.114416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.115238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.115275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.116581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.116615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.116833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.117712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.119204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.119238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.413 [2024-06-10 10:24:03.119541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.120285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.120331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.120634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.120667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.120988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.121692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.122006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.122041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.122343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.123006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.123042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.123344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.123377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.123699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.124379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.124688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.124721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.125029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.125682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.125718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.126031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.126064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.126357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.413 [2024-06-10 10:24:03.127077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.413 [2024-06-10 10:24:03.127388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.127421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.127723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.128386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.128423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.128724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.128756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.129109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.129936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.130246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.130295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.130597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.131289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.131330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.131633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.131674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.132007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.132981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.133291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.133325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.133633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.134323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.134359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.414 [2024-06-10 10:24:03.134663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.134695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.135038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.135873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.136183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.136221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.136525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.136539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.136891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.137283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.137327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.137630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.137677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.138022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.139265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.139303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.139606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.139638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.139908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.140300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.140348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.140651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.140691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.140982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.414 [2024-06-10 10:24:03.141793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.141835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.141866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.141895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.142254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.142337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.142368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.142397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.142427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.142789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.143711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.143748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.143777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.143806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.144023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.144103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.144134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.144163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.144194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.144517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.145197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.145233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.145262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.145291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.414 [2024-06-10 10:24:03.145641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.145724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.145755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.145796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.145832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.146167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.146874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.146911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.146941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.146970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.147326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.147409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.147440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.147470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.147502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.147713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.148423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.148465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.148494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.148542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.148967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.149048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.149079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.414 [2024-06-10 10:24:03.149109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.415 [2024-06-10 10:24:03.149150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.149359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.150333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.150371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.150402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.150433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.150718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.150808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.150848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.150878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.150908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.151290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.151977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.152013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.152043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.152073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.152381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.152458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.152489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.152518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.152548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.152806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.153544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.153584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.415 [2024-06-10 10:24:03.153614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.153643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.154015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.154093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.154125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.155643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.155675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.155890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.156725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.158352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.158386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.159214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.159428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.159532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.160588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.160621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.161159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.161372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.162136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.163157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.163191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.164341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.164577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.164673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.415 [2024-06-10 10:24:03.165133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.165168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.166177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.166487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.167191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.168354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.168390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.169298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.169656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.169753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.170062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.170094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.171690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.171909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.172724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.174134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.174167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.174793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.175011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.175107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.175887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.175920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.176737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.176970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.415 [2024-06-10 10:24:03.177633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.178381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.178415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.179575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.179788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.179885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.181156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.181189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.181514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.181728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.182583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.183756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.183793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.184361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.184575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.184670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.186010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.186043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.186598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.186811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.415 [2024-06-10 10:24:03.187671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.188304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.188337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.189488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.416 [2024-06-10 10:24:03.189702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.189798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.190762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.190796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.191955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.192201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.192802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.193741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.193775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.194431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.194645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.194739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.195892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.195926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.196751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.196968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.197666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.199126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.199160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.199481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.199693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.199803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.200113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.200146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.416 [2024-06-10 10:24:03.201461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.201674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.204139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.205159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.205192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.206237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.206480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.206577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.207576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.207609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.208759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.209026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.209776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.210739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.210773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.212348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.212747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.212848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.214405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.214438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.214739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.214956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.215666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.416 [2024-06-10 10:24:03.217330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.416 [2024-06-10 10:24:03.217364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
... [the same *ERROR* record from accel_dpdk_cryptodev.c:468 repeats several hundred times, differing only in timestamp; application timestamps run 10:24:03.217364 through 10:24:03.599941, console timestamps 00:28:41.416 through 00:28:41.947] ...
00:28:41.947 [2024-06-10 10:24:03.599941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:41.947 [2024-06-10 10:24:03.603862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.603903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.603933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.603964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.604292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.604373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.604405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.604434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.604465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.604735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.607055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.607094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.607123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.607153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.607415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.607493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.607525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.607559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.607589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.607849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.611468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.611507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.611537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.613158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.947 [2024-06-10 10:24:03.613626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.615297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.615332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.615362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.615664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.615887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.619079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.619817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.619857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.620908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.621165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.621267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.622129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.622163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.622819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.623042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.626773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.628021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.628054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.628406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.628621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.628721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.629032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.629073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.947 [2024-06-10 10:24:03.630579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.630853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.633520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.634694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.634727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.635591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.635843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.635964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.636617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.636651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.637799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.638020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.640849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.641188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.641234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.642854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.643303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.643403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.644741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.644774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.645994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.646305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.649174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.649931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.947 [2024-06-10 10:24:03.649965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.650799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.651027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.651126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.652345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.652378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.653727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.654014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.656966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.658395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.658428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.658731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.947 [2024-06-10 10:24:03.659004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.659102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.660248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.660282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.660909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.661148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.664918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.666097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.666130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.666484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.666698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.666797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.948 [2024-06-10 10:24:03.668413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.668446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.669124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.669379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.672533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.672942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.672975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.674032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.674283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.674381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.675268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.675302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.676684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.676948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.678762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.679297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.679330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.680386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.680636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.680736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.681468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.681501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.683059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.683460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.948 [2024-06-10 10:24:03.686555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.687968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.688001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.688753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.689049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.689147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.690425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.690459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.691863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.692142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.695969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.697399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.697433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.699035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.699253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.699357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.700049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.700083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.701559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.701778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.703743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.704160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.704193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.705453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.948 [2024-06-10 10:24:03.705669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.705782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.707292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.707324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.708572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.708790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.712325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.713874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.713908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.714392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.714609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.714709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.715226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.715259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.716512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.716727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.720013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.721276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.721309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.722808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.723070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.723184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.724532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.724566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.948 [2024-06-10 10:24:03.725236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.725450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.728165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.729636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.729670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.730859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.731129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.731227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.732480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.732514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.948 [2024-06-10 10:24:03.734029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.734366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.738894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.740158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.740191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.741450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.741664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.741762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.743059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.743091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.744448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.744719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.747659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.748104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.949 [2024-06-10 10:24:03.748138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.749297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.749607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.749709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.750976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.751008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.752271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.752485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.755941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.757458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.757492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.758222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.758436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.758535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.759124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.759158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.760159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.760414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.764020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.765185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.765218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.766481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.766737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.766847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.949 [2024-06-10 10:24:03.768348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.768381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.769099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.769315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.770816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.772127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.772160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.773667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.773887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.773986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.775292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.775325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.776583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.776834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.781210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.782550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.782584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.783128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.783343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.783438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.784767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.784801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.786311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.786526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:41.949 [2024-06-10 10:24:03.789529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.790386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.790420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.791833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.792227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.792331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.793563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.793596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.794225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.794453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.798389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.798432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.798462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.799724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.799965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.800062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.801562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.801594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.802324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.802539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.804012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.805374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.806899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:41.949 [2024-06-10 10:24:03.808362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.220 [2024-06-10 10:24:03.808666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.808767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.810037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.811302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.812806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.813108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.818773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.820152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.821398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.822905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.823186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.824704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.826328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.827829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.829409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.829711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.834081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.835343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.836852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.838195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.838423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.839746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.841017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.842519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.220 [2024-06-10 10:24:03.843490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.843704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.846305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.847572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.849080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.849691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.849907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.851582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.853140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.854763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.855651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.855895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.859247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.860494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.861693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.862958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.863247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.864265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.865843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.866147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.867782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.868228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.872289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.220 [2024-06-10 10:24:03.873180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.220 [2024-06-10 10:24:03.874779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:42.220 [2024-06-10 10:24:03.875088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:42.220 [2024-06-10 10:24:03.875302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:42.220 [...] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (same message repeated continuously, timestamps 2024-06-10 10:24:03.875688 through 10:24:04.178485)
00:28:42.487 [2024-06-10 10:24:04.178769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:42.487 [2024-06-10 10:24:04.178868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:42.487 [2024-06-10 10:24:04.179311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.487 [2024-06-10 10:24:04.179343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.487 [2024-06-10 10:24:04.180605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.487 [2024-06-10 10:24:04.180818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.181498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.182531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.182565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.183827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.184078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.184173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.185434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.185466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.186388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.186766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.188030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.189693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.189725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.191378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.191606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.191703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.192671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.192703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.193959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.194205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.488 [2024-06-10 10:24:04.194884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.195210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.195257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.195559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.195773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.195904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.197370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.197416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.198915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.199127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.199873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.201135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.201167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.202430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.202644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.202740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.203049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.203080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.203488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.203701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.204387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.205769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.205801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.206800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.488 [2024-06-10 10:24:04.207040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.207139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.208405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.208436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.209677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.209891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.211065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.212696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.212728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.214392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.214610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.214705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.216030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.216070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.217032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.217297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.217968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.219390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.219422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.219734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.220145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.220258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.221855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.221886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.488 [2024-06-10 10:24:04.223532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.223749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.224554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.224593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.224622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.225882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.226188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.226300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.227563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.227595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.228667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.229084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.231475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.232298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.233713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.235329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.235586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.235684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.236212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.236517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.236974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.237186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.241743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.488 [2024-06-10 10:24:04.242561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.488 [2024-06-10 10:24:04.243582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.244324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.244601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.245953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.247461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.248025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.249220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.249432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.251061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.251974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.253326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.254361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.254634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.256045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.256351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.256653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.257835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.258180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.259776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.261056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.262272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.262823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.263042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.263663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.489 [2024-06-10 10:24:04.264747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.265657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.266869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.267110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.268160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.268470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.269557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.270460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.270673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.271695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.272606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.273937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.274242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.274554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.276650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.277687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.278590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.279870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.280161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.280530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.281652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.282553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.283724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.283946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.489 [2024-06-10 10:24:04.284949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.285258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.286419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.287326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.287541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.288658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.289574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.290783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.291089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.291374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.293346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.294568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.295471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.296556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.296927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.297312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.298706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.299615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.300500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.300713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.301848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.302156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.303775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.303807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.489 [2024-06-10 10:24:04.304048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.304145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.304700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.306030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.306062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.306273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.307655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.307692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.308726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.308757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.308971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.309778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.309813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.310723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.310754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.310969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.312751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.312789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.313694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.313725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.313958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.489 [2024-06-10 10:24:04.315372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.315405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.316374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.490 [2024-06-10 10:24:04.316405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.316642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.319802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.319852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.320809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.320843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.321103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.322527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.322561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.323773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.323804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.324051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.326097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.326133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.327388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.327419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.327630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.329175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.329215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.330599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.330630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.330917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.334744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.334783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.490 [2024-06-10 10:24:04.335668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.335700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.335915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.336741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.336778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.337584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.337615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.337828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.339241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.339291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.340790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.340824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.341139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.342689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.342723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.343028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.343059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.343501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.344749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.344797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.345104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.345136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.345570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.345959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.490 [2024-06-10 10:24:04.345997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.346300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.346331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.346765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.348600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.490 [2024-06-10 10:24:04.348639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.348954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.348987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.349371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.349741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.349775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.350090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.350123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.350428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.351658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.351695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.352003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.352035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.352363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.352736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.352769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.353094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.353130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.353447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.755 [2024-06-10 10:24:04.354665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.354704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.355012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.355044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.355405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.355781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.355814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.356142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.356175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.356520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.357735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.357774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.358082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.358114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.358395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.358786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.358819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.359141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.359182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.359487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.360808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.360851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.361152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.361183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.755 [2024-06-10 10:24:04.361441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.361836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.755 [2024-06-10 10:24:04.361886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.362188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.362219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.362454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.364088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.364126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.364428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.364459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.364758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.366368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.366402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.368056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.368088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.368379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.369339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.369376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.369677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.369709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.369946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.371184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.371218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:42.756 [2024-06-10 10:24:04.372168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:42.756 [2024-06-10 10:24:04.372198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:42.756 ... (same *ERROR* line from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated continuously between 10:24:04.372198 and 10:24:04.639889) ...
00:28:43.026 [2024-06-10 10:24:04.639889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:43.026 [2024-06-10 10:24:04.641417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.641448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.641666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.643103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.643137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.644481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.644511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.644776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.645814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.645856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.646159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.646191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.646405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.648070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.648104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.648525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.648555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.648767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.649863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.649900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.650202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.650233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.650586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.650962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.027 [2024-06-10 10:24:04.650997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.651298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.651329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.651647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.652852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.652889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.653191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.653221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.653528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.653914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.653947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.654261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.654296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.654696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.655793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.655835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.656138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.656169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.656524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.656903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.656937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.657251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.657281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.657600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.027 [2024-06-10 10:24:04.658789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.658830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.659133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.659164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.659475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.659850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.659884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.660201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.660232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.660519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.661688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.661726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.662033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.662065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.662407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.662785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.662818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.663125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.663160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.663409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.664783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.664825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.665128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.665159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.027 [2024-06-10 10:24:04.665466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.665856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.665890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.666192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.666222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.666517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.667924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.667961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.669498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.669529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.669828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.670378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.670411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.671618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.671649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.671865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.027 [2024-06-10 10:24:04.673494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.673532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.674434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.674464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.674676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.675812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.675849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.676758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.028 [2024-06-10 10:24:04.676788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.677078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.679788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.679829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.680904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.680936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.681242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.682787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.682841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.684225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.684256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.684591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.687204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.687242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.688802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.688835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.689119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.690511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.690545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.691988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.692019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.692386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.694332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.694369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.028 [2024-06-10 10:24:04.695514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.695545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.695777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.696998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.697031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.697988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.698038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.698446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.700890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.700928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.701619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.701650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.701865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.703147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.703180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.703550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.703580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.703814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.706036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.706075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.706536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.706567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.706784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.708396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.028 [2024-06-10 10:24:04.708430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.708731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.708761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.709065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.711201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.711238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.712142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.712172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.712475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.713632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.713667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.713993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.714039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.714362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.715954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.715993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.717366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.717397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.717713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.718384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.718418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.718721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.718753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.719010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.028 [2024-06-10 10:24:04.720062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.720110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.721630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.721661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.721876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.722264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.722298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.722599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.028 [2024-06-10 10:24:04.722629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.722917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.724058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.724095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.725254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.725285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.725498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.725888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.725921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.726242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.726274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.726518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.727528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.727569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.728724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.728755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.029 [2024-06-10 10:24:04.728971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.729341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.729375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.729675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.729706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.729955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.731598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.731636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.732789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.732820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.733103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.733489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.733522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.733829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.733861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.734075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.736257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.736306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.737666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.737697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.738000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.738373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.738406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.738874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.029 [2024-06-10 10:24:04.738906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.739135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.741349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.743010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.743315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.743345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.743580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.744580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.744615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.745255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.745286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.745500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.746589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.746626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.746656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.746685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.747000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.748602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.748635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.748664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.748694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.748998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.749753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.749787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.029 [2024-06-10 10:24:04.749816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.749850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.750184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.750265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.750296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.750325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.750354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.750632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.751379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.751414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.751447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.751476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.751687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.751767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.751797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.751831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.751862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.752431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.754024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.754061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.754091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.754128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.754350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.754448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.029 [2024-06-10 10:24:04.754478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.754521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.754550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.754849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.755668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.029 [2024-06-10 10:24:04.755703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.755732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.755765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.756168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.756247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.756280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.756309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.756339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.756548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.757329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.757364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.757397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.757426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.757767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.757849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.757880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.757910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.757941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.758234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.030 [2024-06-10 10:24:04.758910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.758946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.758975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.759042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.759293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.759371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.759401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.759431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.759460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.759670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.760451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.760487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.760516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.760545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.760757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.760838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.760869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.760899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.760929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.761143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.761889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.761923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.761952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.030 [2024-06-10 10:24:04.762839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.030 [2024-06-10 10:24:04.763079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:43.300 [identical *ERROR* message from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources: "Failed to get src_mbufs!") repeated continuously for every allocation attempt from 2024-06-10 10:24:04.763079 through 2024-06-10 10:24:05.062220; duplicate entries collapsed]
00:28:43.300 [2024-06-10 10:24:05.062251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.300 [2024-06-10 10:24:05.062535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.300 [2024-06-10 10:24:05.064414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.300 [2024-06-10 10:24:05.064452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.300 [2024-06-10 10:24:05.065334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.300 [2024-06-10 10:24:05.065365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.300 [2024-06-10 10:24:05.065793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.300 [2024-06-10 10:24:05.066188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.300 [2024-06-10 10:24:05.066222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.067759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.067791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.068011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.070338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.070377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.070684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.070714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.070963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.071334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.071368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.071670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.071714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.072050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.074097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.074135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.301 [2024-06-10 10:24:05.075395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.075426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.075639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.076858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.076892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.078151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.078182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.078421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.080125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.080163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.080467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.080497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.080801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.081201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.081235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.081536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.081567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.081786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.082952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.084224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.085728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.085762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.085978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.086352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.301 [2024-06-10 10:24:05.086387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.086690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.086720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.087023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.088453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.088492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.088539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.088568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.088779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.090430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.090476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.090506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.090535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.090785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.091499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.091535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.091574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.091605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.091964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.092057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.092088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.092119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.092150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.092543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.301 [2024-06-10 10:24:05.093264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.093300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.093329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.093358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.093572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.093648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.093678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.093707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.093736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.094040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.094758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.094793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.094828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.094859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.095069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.095148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.095178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.095208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.095237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.095545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.097376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.097412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.097442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.097473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.301 [2024-06-10 10:24:05.097714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.097795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.097833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.097863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.097892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.098117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.301 [2024-06-10 10:24:05.098919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.098955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.098985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.099014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.099236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.099313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.099343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.099373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.099402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.099663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.100367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.100415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.100444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.100473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.100874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.100951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.100982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.101011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.302 [2024-06-10 10:24:05.101040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.101329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.102030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.102066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.102095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.102125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.102332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.102406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.102436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.102466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.102495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.102766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.103479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.103514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.103544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.105054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.105276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.105647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.105679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.105727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.106036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.106251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.107010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.108528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.302 [2024-06-10 10:24:05.108561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.109358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.109694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.109803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.111075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.111107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.112617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.112877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.114091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.115679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.115712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.117267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.117524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.117621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.119177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.119211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.120156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.120417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.121124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.122134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.122181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.122483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.122945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.123061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.302 [2024-06-10 10:24:05.124594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.124627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.126167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.126379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.127151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.128415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.128448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.129707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.129925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.130020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.130915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.302 [2024-06-10 10:24:05.130950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.131255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.131695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.132409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.133897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.133937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.135550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.135815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.135937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.137195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.137226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.138496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.138710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.303 [2024-06-10 10:24:05.139952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.140262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.140294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.141681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.141898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.141994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.143641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.143674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.145137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.145405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.146138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.147647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.147680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.148552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.148929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.149058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.149362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.149394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.150836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.151050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.151754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.152584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.152617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.153878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.303 [2024-06-10 10:24:05.154165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.154267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.155773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.155805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.156619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.157074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.303 [2024-06-10 10:24:05.157974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.159528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.159562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.161095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.161309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.161414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.162272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.162305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.163563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.163843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.164549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.164875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.164923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.165225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.165440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.165557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.166892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.166924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.566 [2024-06-10 10:24:05.167470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.167681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.168495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.168805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.168842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.169755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.170003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.170112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.171381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.171413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.172682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.172988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.173887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.174196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.174228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.175790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.176157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.176267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.177767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.177802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.178427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.178639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.179363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.179673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.566 [2024-06-10 10:24:05.179704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.180011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.180299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.180418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.181572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.181603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.182658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.182911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.183633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.183946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.184000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.184301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.184620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.184738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.185975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.186007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.186368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.186580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.187454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.187764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.187796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.188353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.188568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.188663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.566 [2024-06-10 10:24:05.190096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.190128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.191017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.191260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.192122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.192431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.192462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.566 [2024-06-10 10:24:05.193646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.193893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.194000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.194749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.194781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.196340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.196602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.197630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.197943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.197976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.199396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.199609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.199711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.200096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.200127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.201338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.567 [2024-06-10 10:24:05.201552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.567 [2024-06-10 10:24:05.202338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeat for every subsequent allocation attempt between 10:24:05.202338 and 10:24:05.468393 (console time 00:28:43.567-00:28:43.835) ...]
00:28:43.835 [2024-06-10 10:24:05.468393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:43.835 [2024-06-10 10:24:05.468608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.468704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.470011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.470044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.470347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.470773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.471455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.472723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.472756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.474026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.474264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.474375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.475629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.475661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.476888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.477100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.477898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.478208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.478239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.479752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.480024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.480120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.481390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.481421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.835 [2024-06-10 10:24:05.482681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.482949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.483290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.484952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.484985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.486646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.486895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.487002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.487306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.487337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.487639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.487928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.488622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.489315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.489348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.490258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.490471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.490567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.490874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.490912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.491212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.491452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.492183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.493369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.835 [2024-06-10 10:24:05.493401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.494309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.494579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.494676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.494986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.495018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.495367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.495582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.496366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.497921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.497954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.499530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.499879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.499996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.500298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.500329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.501506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.501832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.502546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.503454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.503488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.504555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.504872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.504974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.835 [2024-06-10 10:24:05.506595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.506628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.507866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.508209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.508995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.509731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.509763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.835 [2024-06-10 10:24:05.510072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.510400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.510507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.511708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.511741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.513317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.513627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.514274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.514582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.514614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.514921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.515139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.515258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.516164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.516195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.517066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.517279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:43.836 [2024-06-10 10:24:05.518057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.518365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.518417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.518719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.518941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.519061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.520474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.520506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.521132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.521351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.522154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.522462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.522494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.523361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.523608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.523722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.525003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.525034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.526268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.526595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.527340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.527649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.527679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:43.836 [2024-06-10 10:24:05.527996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:44.096 
00:28:44.096 Latency(us) 
00:28:44.096 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 
00:28:44.096 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:28:44.096 Verification LBA range: start 0x0 length 0x100 
00:28:44.096 crypto_ram : 5.71 44.87 2.80 0.00 0.00 2758471.68 70173.93 2452054.65 
00:28:44.096 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:28:44.096 Verification LBA range: start 0x100 length 0x100 
00:28:44.096 crypto_ram : 5.70 44.94 2.81 0.00 0.00 2742585.50 86709.17 2413337.99 
00:28:44.096 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:28:44.096 Verification LBA range: start 0x0 length 0x100 
00:28:44.096 crypto_ram2 : 5.71 44.86 2.80 0.00 0.00 2665623.24 69770.63 2464960.20 
00:28:44.096 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:28:44.096 Verification LBA range: start 0x100 length 0x100 
00:28:44.096 crypto_ram2 : 5.71 47.46 2.97 0.00 0.00 2539164.42 6704.84 2387526.89 
00:28:44.096 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:28:44.096 Verification LBA range: start 0x0 length 0x100 
00:28:44.096 crypto_ram3 : 5.53 305.75 19.11 0.00 0.00 376514.26 20164.92 554938.68 
00:28:44.096 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:28:44.096 Verification LBA range: start 0x100 length 0x100 
00:28:44.096 crypto_ram3 : 5.54 315.87 19.74 0.00 0.00 364595.92 29239.14 551712.30 
00:28:44.096 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:28:44.096 Verification LBA range: start 0x0 length 0x100 
00:28:44.096 crypto_ram4 : 5.63 323.27 20.20 0.00 0.00 345909.63 13409.67 425883.18 
00:28:44.097 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:28:44.097 Verification LBA range: start 0x100 length 0x100 
00:28:44.097 crypto_ram4 : 5.64 332.70 20.79 0.00 0.00 336352.48 13712.15 471052.60 
00:28:44.097 =================================================================================================================== 
00:28:44.097 Total : 1459.72 91.23 0.00 0.00 649921.84 6704.84 2464960.20 
00:28:44.356 
00:28:44.356 real 0m8.578s 
00:28:44.356 user 0m16.540s 
00:28:44.356 sys 0m0.279s 
00:28:44.356 10:24:06 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable 
00:28:44.356 10:24:06 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 
00:28:44.356 ************************************ 
00:28:44.356 END TEST bdev_verify_big_io 
00:28:44.356 ************************************ 
00:28:44.356 10:24:06 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
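For scale, the MiB/s column in the bdev_verify_big_io Latency table above is simply IOPS multiplied by the 65536-byte IO size: 44.87 IOPS x 64 KiB works out to about 2.80 MiB/s, matching the first crypto_ram row, and the same relation holds for every row including the Total line. A quick, illustrative way to re-check any row from a shell (the numbers are taken from the table; nothing here is part of the captured run):
awk -v iops=44.87 -v iosz=65536 'BEGIN { printf "%.2f MiB/s\n", iops * iosz / 1048576 }'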
00:28:44.356 10:24:06 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:28:44.356 10:24:06 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:44.356 10:24:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:44.356 ************************************ 00:28:44.356 START TEST bdev_write_zeroes 00:28:44.356 ************************************ 00:28:44.356 10:24:06 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:44.356 [2024-06-10 10:24:06.221257] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:28:44.356 [2024-06-10 10:24:06.221304] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1163758 ] 00:28:44.614 [2024-06-10 10:24:06.309279] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:44.614 [2024-06-10 10:24:06.386314] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:44.614 [2024-06-10 10:24:06.407320] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:44.614 [2024-06-10 10:24:06.415351] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:44.614 [2024-06-10 10:24:06.423367] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:44.874 [2024-06-10 10:24:06.506163] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:47.413 [2024-06-10 10:24:08.673452] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:47.413 [2024-06-10 10:24:08.673506] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:47.413 [2024-06-10 10:24:08.673514] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:47.413 [2024-06-10 10:24:08.681470] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:47.413 [2024-06-10 10:24:08.681481] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:47.413 [2024-06-10 10:24:08.681486] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:47.413 [2024-06-10 10:24:08.689489] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:47.413 [2024-06-10 10:24:08.689499] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:47.413 [2024-06-10 10:24:08.689505] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:47.413 [2024-06-10 10:24:08.697509] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:47.413 [2024-06-10 10:24:08.697518] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:47.413 [2024-06-10 10:24:08.697523] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:47.413 Running I/O for 1 seconds... 
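As a gloss on the bdevperf invocation echoed above (the flag meanings follow bdevperf's usage text and are not printed in this log): -q 128 is the per-job queue depth, -o 4096 is the IO size in bytes, -w write_zeroes selects the workload, and -t 1 is the run time in seconds; the trailing '' appears to be an empty extra-arguments placeholder expanded by the calling script. The generic shape, run from an SPDK checkout, would be roughly:
./build/examples/bdevperf --json ./test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1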
00:28:47.983 
00:28:47.984 Latency(us) 
00:28:47.984 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 
00:28:47.984 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 
00:28:47.984 crypto_ram : 1.02 2358.04 9.21 0.00 0.00 54030.59 4688.34 64527.75 
00:28:47.984 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 
00:28:47.984 crypto_ram2 : 1.02 2363.77 9.23 0.00 0.00 53636.54 4688.34 59688.17 
00:28:47.984 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 
00:28:47.984 crypto_ram3 : 1.02 18193.67 71.07 0.00 0.00 6946.76 2155.13 8922.98 
00:28:47.984 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 
00:28:47.984 crypto_ram4 : 1.02 18230.70 71.21 0.00 0.00 6914.52 2142.52 7208.96 
00:28:47.984 =================================================================================================================== 
00:28:47.984 Total : 41146.18 160.73 0.00 0.00 12333.06 2142.52 64527.75 
00:28:48.243 
00:28:48.243 real 0m3.865s 
00:28:48.243 user 0m3.591s 
00:28:48.243 sys 0m0.234s 
00:28:48.243 10:24:10 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable 
00:28:48.243 10:24:10 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 
00:28:48.243 ************************************ 
00:28:48.243 END TEST bdev_write_zeroes 
00:28:48.243 ************************************ 
00:28:48.243 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:28:48.243 10:24:10 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 
00:28:48.243 10:24:10 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 
00:28:48.243 10:24:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 
00:28:48.243 ************************************ 
00:28:48.243 START TEST bdev_json_nonenclosed 
00:28:48.243 ************************************ 
00:28:48.243 10:24:10 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:28:48.503 [2024-06-10 10:24:10.154131] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:28:48.503 [2024-06-10 10:24:10.154175] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1164389 ] 
00:28:48.503 [2024-06-10 10:24:10.241166] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 
00:28:48.503 [2024-06-10 10:24:10.315960] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 
00:28:48.503 [2024-06-10 10:24:10.316018] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:28:48.503 [2024-06-10 10:24:10.316028] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:48.503 [2024-06-10 10:24:10.316035] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:48.763 00:28:48.763 real 0m0.273s 00:28:48.763 user 0m0.173s 00:28:48.763 sys 0m0.098s 00:28:48.763 10:24:10 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:48.763 10:24:10 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:28:48.763 ************************************ 00:28:48.763 END TEST bdev_json_nonenclosed 00:28:48.763 ************************************ 00:28:48.763 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:48.763 10:24:10 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:28:48.763 10:24:10 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:48.763 10:24:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:48.763 ************************************ 00:28:48.763 START TEST bdev_json_nonarray 00:28:48.763 ************************************ 00:28:48.763 10:24:10 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:48.763 [2024-06-10 10:24:10.494496] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:28:48.763 [2024-06-10 10:24:10.494542] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1164538 ] 00:28:48.763 [2024-06-10 10:24:10.582246] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:49.023 [2024-06-10 10:24:10.658843] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:49.023 [2024-06-10 10:24:10.658906] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
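The two "Invalid JSON configuration" errors above are the expected outcome of bdevperf being fed deliberately malformed --json files (nonenclosed.json and nonarray.json). Their exact contents are not reproduced in this log, so the files below are assumed minimal reproductions inferred from the error text, with /tmp paths chosen purely for illustration:
cat > /tmp/nonenclosed.json <<'EOF'
"subsystems": []
EOF
# -> json_config_prepare_ctx: "not enclosed in {}" (the top level is not a JSON object)
cat > /tmp/nonarray.json <<'EOF'
{ "subsystems": { "subsystem": "bdev" } }
EOF
# -> json_config_prepare_ctx: "'subsystems' should be an array"
# A well-formed config is a single object whose "subsystems" member is an array, e.g.
# { "subsystems": [ { "subsystem": "bdev", "config": [] } ] }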
00:28:49.023 [2024-06-10 10:24:10.658917] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:49.023 [2024-06-10 10:24:10.658924] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:49.023 00:28:49.023 real 0m0.276s 00:28:49.023 user 0m0.165s 00:28:49.023 sys 0m0.109s 00:28:49.023 10:24:10 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:49.023 10:24:10 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:28:49.023 ************************************ 00:28:49.023 END TEST bdev_json_nonarray 00:28:49.023 ************************************ 00:28:49.023 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:28:49.023 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:28:49.023 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:28:49.023 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:28:49.023 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:28:49.023 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:28:49.023 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:49.023 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:28:49.023 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:28:49.023 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:28:49.023 10:24:10 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:28:49.023 00:28:49.023 real 1m7.005s 00:28:49.023 user 2m50.133s 00:28:49.023 sys 0m5.764s 00:28:49.023 10:24:10 blockdev_crypto_aesni -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:49.023 10:24:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:49.023 ************************************ 00:28:49.023 END TEST blockdev_crypto_aesni 00:28:49.023 ************************************ 00:28:49.023 10:24:10 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:28:49.023 10:24:10 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:28:49.023 10:24:10 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:49.023 10:24:10 -- common/autotest_common.sh@10 -- # set +x 00:28:49.023 ************************************ 00:28:49.023 START TEST blockdev_crypto_sw 00:28:49.023 ************************************ 00:28:49.023 10:24:10 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:28:49.283 * Looking for test storage... 
00:28:49.283 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:49.283 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:28:49.283 10:24:10 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:28:49.283 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:28:49.283 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:49.283 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1164725 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1164725 00:28:49.284 10:24:10 blockdev_crypto_sw -- common/autotest_common.sh@830 -- # '[' -z 1164725 ']' 00:28:49.284 10:24:10 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:28:49.284 10:24:10 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:49.284 10:24:10 blockdev_crypto_sw -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:49.284 10:24:10 blockdev_crypto_sw -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:28:49.284 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:49.284 10:24:10 blockdev_crypto_sw -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:49.284 10:24:10 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:49.284 [2024-06-10 10:24:11.018588] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:28:49.284 [2024-06-10 10:24:11.018651] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1164725 ] 00:28:49.284 [2024-06-10 10:24:11.109083] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:49.545 [2024-06-10 10:24:11.178626] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:50.116 10:24:11 blockdev_crypto_sw -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:50.116 10:24:11 blockdev_crypto_sw -- common/autotest_common.sh@863 -- # return 0 00:28:50.116 10:24:11 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:28:50.116 10:24:11 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:28:50.116 10:24:11 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:28:50.116 10:24:11 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:50.116 10:24:11 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:50.377 Malloc0 00:28:50.377 Malloc1 00:28:50.377 true 00:28:50.377 true 00:28:50.377 true 00:28:50.377 [2024-06-10 10:24:12.024048] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:50.377 crypto_ram 00:28:50.377 [2024-06-10 10:24:12.032072] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:50.377 crypto_ram2 00:28:50.377 [2024-06-10 10:24:12.040092] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:50.377 crypto_ram3 00:28:50.377 [ 00:28:50.377 { 00:28:50.377 "name": "Malloc1", 00:28:50.377 "aliases": [ 00:28:50.377 "0ec10496-56a3-40be-894b-131fa298192b" 00:28:50.377 ], 00:28:50.377 "product_name": "Malloc disk", 00:28:50.377 "block_size": 4096, 00:28:50.377 "num_blocks": 4096, 00:28:50.377 "uuid": "0ec10496-56a3-40be-894b-131fa298192b", 00:28:50.377 "assigned_rate_limits": { 00:28:50.377 "rw_ios_per_sec": 0, 00:28:50.377 "rw_mbytes_per_sec": 0, 00:28:50.377 "r_mbytes_per_sec": 0, 00:28:50.377 "w_mbytes_per_sec": 0 00:28:50.377 }, 00:28:50.377 "claimed": true, 00:28:50.377 "claim_type": "exclusive_write", 00:28:50.377 "zoned": false, 00:28:50.377 "supported_io_types": { 00:28:50.377 "read": true, 00:28:50.377 "write": true, 00:28:50.377 "unmap": true, 00:28:50.377 "write_zeroes": true, 00:28:50.377 "flush": true, 00:28:50.377 "reset": true, 00:28:50.377 "compare": false, 00:28:50.377 "compare_and_write": false, 00:28:50.377 "abort": true, 00:28:50.377 "nvme_admin": false, 00:28:50.377 "nvme_io": false 00:28:50.377 }, 00:28:50.377 "memory_domains": [ 00:28:50.377 { 00:28:50.377 "dma_device_id": "system", 00:28:50.377 "dma_device_type": 1 00:28:50.377 }, 00:28:50.377 { 00:28:50.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:50.377 "dma_device_type": 2 00:28:50.377 } 00:28:50.377 ], 00:28:50.377 "driver_specific": {} 00:28:50.377 } 00:28:50.377 ] 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:50.377 10:24:12 blockdev_crypto_sw -- 
bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:50.377 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:28:50.377 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:50.377 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:50.377 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:50.377 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:28:50.377 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:28:50.377 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:50.377 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:50.377 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:28:50.377 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:28:50.377 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "47c4a432-3bd4-5aa3-8051-de1a12d261cb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "47c4a432-3bd4-5aa3-8051-de1a12d261cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "47c356d7-bb91-5207-a9be-c27c89d77fc5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' 
' "num_blocks": 4096,' ' "uuid": "47c356d7-bb91-5207-a9be-c27c89d77fc5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:28:50.377 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:28:50.378 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:28:50.378 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:28:50.378 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 1164725 00:28:50.378 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@949 -- # '[' -z 1164725 ']' 00:28:50.378 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # kill -0 1164725 00:28:50.378 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # uname 00:28:50.378 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:50.378 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1164725 00:28:50.637 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:50.637 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:50.637 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1164725' 00:28:50.637 killing process with pid 1164725 00:28:50.637 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@968 -- # kill 1164725 00:28:50.637 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@973 -- # wait 1164725 00:28:50.637 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:50.637 10:24:12 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:50.637 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:28:50.637 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:50.637 10:24:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:50.897 ************************************ 00:28:50.897 START TEST bdev_hello_world 00:28:50.897 ************************************ 00:28:50.897 10:24:12 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:50.897 [2024-06-10 10:24:12.565689] Starting SPDK v24.09-pre git sha1 3a44739b7 
/ DPDK 24.03.0 initialization... 00:28:50.897 [2024-06-10 10:24:12.565731] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1165249 ] 00:28:50.897 [2024-06-10 10:24:12.652648] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:50.897 [2024-06-10 10:24:12.718671] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:51.157 [2024-06-10 10:24:12.862713] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:51.157 [2024-06-10 10:24:12.862763] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:51.157 [2024-06-10 10:24:12.862771] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:51.157 [2024-06-10 10:24:12.870730] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:51.157 [2024-06-10 10:24:12.870747] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:51.157 [2024-06-10 10:24:12.870753] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:51.157 [2024-06-10 10:24:12.878751] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:51.157 [2024-06-10 10:24:12.878760] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:51.157 [2024-06-10 10:24:12.878766] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:51.157 [2024-06-10 10:24:12.915777] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:28:51.157 [2024-06-10 10:24:12.915800] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:28:51.157 [2024-06-10 10:24:12.915810] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:28:51.157 [2024-06-10 10:24:12.917138] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:28:51.157 [2024-06-10 10:24:12.917198] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:28:51.157 [2024-06-10 10:24:12.917207] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:28:51.157 [2024-06-10 10:24:12.917231] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
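The repeated 'Found key "test_dek_sw*"' notices above come from rpc_bdev_crypto_create as each app replays the generated bdev.json config; the config itself is not shown in this log. A hand-written RPC sequence that would produce an equivalent crypto_ram on top of Malloc0 is sketched below -- the key material and cipher are placeholders, the Malloc sizing merely matches the 32768 x 512-byte crypto_ram reported by bdevio later in this log, and flag spellings may vary between SPDK releases:
./scripts/rpc.py bdev_malloc_create -b Malloc0 16 512
./scripts/rpc.py accel_crypto_key_create -c AES_XTS -k 00112233445566778899aabbccddeeff -e ffeeddccbbaa99887766554433221100 -n test_dek_sw
./scripts/rpc.py bdev_crypto_create Malloc0 crypto_ram -n test_dek_sw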
00:28:51.157 00:28:51.157 [2024-06-10 10:24:12.917240] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:28:51.418 00:28:51.418 real 0m0.521s 00:28:51.418 user 0m0.352s 00:28:51.418 sys 0m0.153s 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:28:51.418 ************************************ 00:28:51.418 END TEST bdev_hello_world 00:28:51.418 ************************************ 00:28:51.418 10:24:13 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:28:51.418 10:24:13 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:28:51.418 10:24:13 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:51.418 10:24:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:51.418 ************************************ 00:28:51.418 START TEST bdev_bounds 00:28:51.418 ************************************ 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1165360 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1165360' 00:28:51.418 Process bdevio pid: 1165360 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1165360 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 1165360 ']' 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:51.418 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:51.418 10:24:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:51.418 [2024-06-10 10:24:13.168400] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:28:51.418 [2024-06-10 10:24:13.168450] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1165360 ] 00:28:51.418 [2024-06-10 10:24:13.256952] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:51.678 [2024-06-10 10:24:13.331927] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:28:51.678 [2024-06-10 10:24:13.332057] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:28:51.678 [2024-06-10 10:24:13.332061] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:51.678 [2024-06-10 10:24:13.472924] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:51.678 [2024-06-10 10:24:13.472977] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:51.678 [2024-06-10 10:24:13.472985] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:51.678 [2024-06-10 10:24:13.480943] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:51.678 [2024-06-10 10:24:13.480953] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:51.678 [2024-06-10 10:24:13.480959] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:51.678 [2024-06-10 10:24:13.488963] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:51.678 [2024-06-10 10:24:13.488974] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:51.678 [2024-06-10 10:24:13.488979] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:52.248 10:24:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:52.248 10:24:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:28:52.248 10:24:13 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:28:52.248 I/O targets: 00:28:52.248 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:28:52.248 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:28:52.248 00:28:52.248 00:28:52.248 CUnit - A unit testing framework for C - Version 2.1-3 00:28:52.248 http://cunit.sourceforge.net/ 00:28:52.248 00:28:52.248 00:28:52.248 Suite: bdevio tests on: crypto_ram3 00:28:52.248 Test: blockdev write read block ...passed 00:28:52.248 Test: blockdev write zeroes read block ...passed 00:28:52.248 Test: blockdev write zeroes read no split ...passed 00:28:52.248 Test: blockdev write zeroes read split ...passed 00:28:52.248 Test: blockdev write zeroes read split partial ...passed 00:28:52.248 Test: blockdev reset ...passed 00:28:52.248 Test: blockdev write read 8 blocks ...passed 00:28:52.248 Test: blockdev write read size > 128k ...passed 00:28:52.248 Test: blockdev write read invalid size ...passed 00:28:52.248 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:52.248 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:52.248 Test: blockdev write read max offset ...passed 00:28:52.248 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:52.248 Test: blockdev writev readv 8 blocks ...passed 00:28:52.248 Test: 
blockdev writev readv 30 x 1block ...passed 00:28:52.248 Test: blockdev writev readv block ...passed 00:28:52.248 Test: blockdev writev readv size > 128k ...passed 00:28:52.248 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:52.248 Test: blockdev comparev and writev ...passed 00:28:52.248 Test: blockdev nvme passthru rw ...passed 00:28:52.248 Test: blockdev nvme passthru vendor specific ...passed 00:28:52.248 Test: blockdev nvme admin passthru ...passed 00:28:52.248 Test: blockdev copy ...passed 00:28:52.248 Suite: bdevio tests on: crypto_ram 00:28:52.248 Test: blockdev write read block ...passed 00:28:52.249 Test: blockdev write zeroes read block ...passed 00:28:52.249 Test: blockdev write zeroes read no split ...passed 00:28:52.249 Test: blockdev write zeroes read split ...passed 00:28:52.249 Test: blockdev write zeroes read split partial ...passed 00:28:52.249 Test: blockdev reset ...passed 00:28:52.249 Test: blockdev write read 8 blocks ...passed 00:28:52.249 Test: blockdev write read size > 128k ...passed 00:28:52.249 Test: blockdev write read invalid size ...passed 00:28:52.249 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:52.249 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:52.249 Test: blockdev write read max offset ...passed 00:28:52.249 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:52.249 Test: blockdev writev readv 8 blocks ...passed 00:28:52.249 Test: blockdev writev readv 30 x 1block ...passed 00:28:52.249 Test: blockdev writev readv block ...passed 00:28:52.249 Test: blockdev writev readv size > 128k ...passed 00:28:52.249 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:52.249 Test: blockdev comparev and writev ...passed 00:28:52.249 Test: blockdev nvme passthru rw ...passed 00:28:52.249 Test: blockdev nvme passthru vendor specific ...passed 00:28:52.249 Test: blockdev nvme admin passthru ...passed 00:28:52.249 Test: blockdev copy ...passed 00:28:52.249 00:28:52.249 Run Summary: Type Total Ran Passed Failed Inactive 00:28:52.249 suites 2 2 n/a 0 0 00:28:52.249 tests 46 46 46 0 0 00:28:52.249 asserts 260 260 260 0 n/a 00:28:52.249 00:28:52.249 Elapsed time = 0.060 seconds 00:28:52.249 0 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1165360 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 1165360 ']' 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 1165360 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1165360 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1165360' 00:28:52.539 killing process with pid 1165360 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@968 -- # kill 1165360 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@973 -- # wait 1165360 00:28:52.539 
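The bounds pass whose CUnit summary appears above is a two-step pattern: bdevio is started in wait mode against the same bdev.json, then tests.py drives the suites over the RPC socket. A condensed sketch using the paths from this run (the socket wait is abbreviated to a comment):

# Condensed bdev_bounds flow; '-s 0' and the paths are taken from the trace above.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

$SPDK/test/bdev/bdevio/bdevio -w -s 0 --json $SPDK/test/bdev/bdev.json &
bdevio_pid=$!
# ... waitforlisten: poll until /var/tmp/spdk.sock accepts RPCs ...
$SPDK/test/bdev/bdevio/tests.py perform_tests   # runs the per-bdev suites listed above
kill $bdevio_pid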
10:24:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:28:52.539 00:28:52.539 real 0m1.180s 00:28:52.539 user 0m3.187s 00:28:52.539 sys 0m0.276s 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:52.539 ************************************ 00:28:52.539 END TEST bdev_bounds 00:28:52.539 ************************************ 00:28:52.539 10:24:14 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:28:52.539 10:24:14 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:52.539 10:24:14 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:52.539 10:24:14 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:52.539 ************************************ 00:28:52.539 START TEST bdev_nbd 00:28:52.539 ************************************ 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1165615 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1165615 /var/tmp/spdk-nbd.sock 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@830 -- # '[' 
-z 1165615 ']' 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:28:52.539 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:52.539 10:24:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:52.799 [2024-06-10 10:24:14.433660] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:28:52.799 [2024-06-10 10:24:14.433706] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:52.799 [2024-06-10 10:24:14.521140] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:52.799 [2024-06-10 10:24:14.584557] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:53.059 [2024-06-10 10:24:14.728892] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:53.059 [2024-06-10 10:24:14.728936] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:53.059 [2024-06-10 10:24:14.728944] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:53.059 [2024-06-10 10:24:14.736909] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:53.059 [2024-06-10 10:24:14.736920] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:53.059 [2024-06-10 10:24:14.736925] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:53.059 [2024-06-10 10:24:14.744929] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:53.059 [2024-06-10 10:24:14.744939] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:53.059 [2024-06-10 10:24:14.744945] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # 
nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:53.627 1+0 records in 00:28:53.627 1+0 records out 00:28:53.627 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000293737 s, 13.9 MB/s 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:53.627 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:28:53.887 
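The nbd_start_disk / waitfornbd / dd sequence being traced here amounts to exporting each crypto bdev as a kernel /dev/nbdX node and proving it answers direct I/O. A condensed sketch, assuming the bdev_svc app from this run is already listening on /var/tmp/spdk-nbd.sock:

# Condensed NBD round-trip; commands and paths mirror the trace above, same single 4 KiB read.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"

nbd=$($RPC nbd_start_disk crypto_ram)                  # export the bdev, returns e.g. /dev/nbd0
grep -q -w "$(basename "$nbd")" /proc/partitions       # same readiness check waitfornbd performs

dd if="$nbd" of=/tmp/nbdtest bs=4096 count=1 iflag=direct   # one direct 4 KiB read

$RPC nbd_get_disks                                     # reports the [nbd_device, bdev_name] pairs as JSON
$RPC nbd_stop_disk "$nbd"                              # detach before bdev_svc is shut down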
10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:53.887 1+0 records in 00:28:53.887 1+0 records out 00:28:53.887 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277451 s, 14.8 MB/s 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:28:53.887 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:54.147 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:28:54.147 { 00:28:54.147 "nbd_device": "/dev/nbd0", 00:28:54.147 "bdev_name": "crypto_ram" 00:28:54.147 }, 00:28:54.147 { 00:28:54.147 "nbd_device": "/dev/nbd1", 00:28:54.147 "bdev_name": "crypto_ram3" 00:28:54.147 } 00:28:54.147 ]' 00:28:54.147 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:28:54.147 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:28:54.147 { 00:28:54.147 "nbd_device": "/dev/nbd0", 00:28:54.147 "bdev_name": "crypto_ram" 00:28:54.147 }, 00:28:54.147 { 00:28:54.147 "nbd_device": "/dev/nbd1", 00:28:54.147 "bdev_name": "crypto_ram3" 00:28:54.147 } 00:28:54.147 ]' 00:28:54.147 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:28:54.147 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:28:54.147 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:54.147 10:24:15 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:54.147 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:54.147 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:54.147 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:54.147 10:24:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:54.408 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:54.408 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:54.408 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:54.408 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:54.408 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:54.408 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:54.408 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:54.408 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:54.408 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:54.408 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:54.668 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:54.668 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:54.668 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:54.668 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:54.668 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:54.668 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:54.668 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:54.668 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:54.668 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:54.668 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:54.668 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@65 -- # count=0 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:54.928 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:28:55.188 /dev/nbd0 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:28:55.188 1+0 records in 00:28:55.188 1+0 records out 00:28:55.188 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261559 s, 15.7 MB/s 00:28:55.188 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:55.189 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:28:55.189 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:55.189 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:28:55.189 10:24:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:28:55.189 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:55.189 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:55.189 10:24:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:28:55.189 /dev/nbd1 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:55.450 1+0 records in 00:28:55.450 1+0 records out 00:28:55.450 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333936 s, 12.3 MB/s 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:28:55.450 { 00:28:55.450 "nbd_device": "/dev/nbd0", 00:28:55.450 "bdev_name": "crypto_ram" 00:28:55.450 }, 00:28:55.450 { 00:28:55.450 "nbd_device": "/dev/nbd1", 00:28:55.450 "bdev_name": "crypto_ram3" 00:28:55.450 } 00:28:55.450 ]' 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:28:55.450 { 00:28:55.450 "nbd_device": "/dev/nbd0", 00:28:55.450 "bdev_name": "crypto_ram" 00:28:55.450 }, 00:28:55.450 { 00:28:55.450 "nbd_device": "/dev/nbd1", 00:28:55.450 "bdev_name": "crypto_ram3" 00:28:55.450 } 00:28:55.450 ]' 00:28:55.450 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:28:55.711 /dev/nbd1' 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:28:55.711 /dev/nbd1' 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:28:55.711 256+0 records in 00:28:55.711 256+0 records out 00:28:55.711 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0117004 s, 89.6 MB/s 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:28:55.711 256+0 records in 00:28:55.711 256+0 records out 00:28:55.711 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0163824 s, 64.0 MB/s 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:28:55.711 256+0 records in 00:28:55.711 256+0 records out 00:28:55.711 1048576 
bytes (1.0 MB, 1.0 MiB) copied, 0.0254657 s, 41.2 MB/s 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:55.711 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:55.972 10:24:17 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:55.972 10:24:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:56.232 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:28:56.232 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:28:56.232 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:56.232 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:28:56.233 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:28:56.493 malloc_lvol_verify 00:28:56.493 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:28:56.754 2163ffac-1f78-4798-bb2c-03caa9c5f5e4 00:28:56.754 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
bdev_lvol_create lvol 4 -l lvs 00:28:57.013 96cf2b53-1ae2-453b-85d2-49f4103d3141 00:28:57.013 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:28:57.013 /dev/nbd0 00:28:57.013 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:28:57.013 mke2fs 1.46.5 (30-Dec-2021) 00:28:57.013 Discarding device blocks: 0/4096 done 00:28:57.013 Creating filesystem with 4096 1k blocks and 1024 inodes 00:28:57.013 00:28:57.013 Allocating group tables: 0/1 done 00:28:57.013 Writing inode tables: 0/1 done 00:28:57.013 Creating journal (1024 blocks): done 00:28:57.013 Writing superblocks and filesystem accounting information: 0/1 done 00:28:57.013 00:28:57.013 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:28:57.013 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:28:57.013 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:57.014 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:57.014 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:57.014 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:57.014 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:57.014 10:24:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1165615 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 1165615 ']' 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 1165615 00:28:57.273 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:28:57.274 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:57.274 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1165615 00:28:57.274 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:57.274 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = 
sudo ']' 00:28:57.274 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1165615' 00:28:57.274 killing process with pid 1165615 00:28:57.274 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@968 -- # kill 1165615 00:28:57.274 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@973 -- # wait 1165615 00:28:57.534 10:24:19 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:28:57.534 00:28:57.534 real 0m4.838s 00:28:57.534 user 0m7.270s 00:28:57.534 sys 0m1.423s 00:28:57.534 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:57.534 10:24:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:57.534 ************************************ 00:28:57.534 END TEST bdev_nbd 00:28:57.534 ************************************ 00:28:57.534 10:24:19 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:28:57.534 10:24:19 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:28:57.534 10:24:19 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:28:57.534 10:24:19 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:28:57.534 10:24:19 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:28:57.534 10:24:19 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:57.534 10:24:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:57.534 ************************************ 00:28:57.534 START TEST bdev_fio 00:28:57.534 ************************************ 00:28:57.534 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:28:57.534 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:28:57.534 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:57.534 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:57.534 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 
00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:57.535 ************************************ 00:28:57.535 START TEST bdev_fio_rw_verify 00:28:57.535 ************************************ 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:28:57.535 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:28:57.795 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:28:57.795 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:28:57.795 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:28:57.795 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:57.795 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:28:57.795 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:28:57.795 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:28:57.795 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:28:57.795 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:28:57.795 10:24:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:58.054 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, 
(T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:58.054 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:58.054 fio-3.35 00:28:58.054 Starting 2 threads 00:29:10.277 00:29:10.277 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1166823: Mon Jun 10 10:24:30 2024 00:29:10.277 read: IOPS=31.3k, BW=122MiB/s (128MB/s)(1224MiB/10001msec) 00:29:10.277 slat (nsec): min=8621, max=50264, avg=13358.38, stdev=3439.52 00:29:10.277 clat (usec): min=4, max=307, avg=100.52, stdev=41.79 00:29:10.277 lat (usec): min=13, max=323, avg=113.88, stdev=43.10 00:29:10.277 clat percentiles (usec): 00:29:10.277 | 50.000th=[ 96], 99.000th=[ 196], 99.900th=[ 221], 99.990th=[ 247], 00:29:10.277 | 99.999th=[ 273] 00:29:10.277 write: IOPS=37.7k, BW=147MiB/s (154MB/s)(1394MiB/9477msec); 0 zone resets 00:29:10.277 slat (usec): min=8, max=505, avg=23.55, stdev= 4.67 00:29:10.277 clat (usec): min=16, max=927, avg=136.77, stdev=65.41 00:29:10.277 lat (usec): min=33, max=1091, avg=160.32, stdev=67.30 00:29:10.277 clat percentiles (usec): 00:29:10.277 | 50.000th=[ 130], 99.000th=[ 281], 99.900th=[ 314], 99.990th=[ 553], 00:29:10.277 | 99.999th=[ 742] 00:29:10.277 bw ( KiB/s): min=129648, max=165952, per=94.95%, avg=143060.21, stdev=6647.28, samples=38 00:29:10.277 iops : min=32412, max=41488, avg=35765.05, stdev=1661.82, samples=38 00:29:10.277 lat (usec) : 10=0.01%, 20=0.01%, 50=10.52%, 100=32.03%, 250=54.00% 00:29:10.277 lat (usec) : 500=3.44%, 750=0.01%, 1000=0.01% 00:29:10.277 cpu : usr=99.72%, sys=0.00%, ctx=29, majf=0, minf=458 00:29:10.277 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:10.277 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:10.277 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:10.277 issued rwts: total=313400,356984,0,0 short=0,0,0,0 dropped=0,0,0,0 00:29:10.277 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:10.277 00:29:10.277 Run status group 0 (all jobs): 00:29:10.277 READ: bw=122MiB/s (128MB/s), 122MiB/s-122MiB/s (128MB/s-128MB/s), io=1224MiB (1284MB), run=10001-10001msec 00:29:10.277 WRITE: bw=147MiB/s (154MB/s), 147MiB/s-147MiB/s (154MB/s-154MB/s), io=1394MiB (1462MB), run=9477-9477msec 00:29:10.277 00:29:10.277 real 0m10.951s 00:29:10.277 user 0m26.696s 00:29:10.277 sys 0m0.304s 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:29:10.277 ************************************ 00:29:10.277 END TEST bdev_fio_rw_verify 00:29:10.277 ************************************ 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1281 -- # local bdev_type= 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "47c4a432-3bd4-5aa3-8051-de1a12d261cb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "47c4a432-3bd4-5aa3-8051-de1a12d261cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "47c356d7-bb91-5207-a9be-c27c89d77fc5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "47c356d7-bb91-5207-a9be-c27c89d77fc5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' 
"base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:29:10.277 crypto_ram3 ]] 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "47c4a432-3bd4-5aa3-8051-de1a12d261cb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "47c4a432-3bd4-5aa3-8051-de1a12d261cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "47c356d7-bb91-5207-a9be-c27c89d77fc5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "47c356d7-bb91-5207-a9be-c27c89d77fc5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev 
--iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:10.277 ************************************ 00:29:10.277 START TEST bdev_fio_trim 00:29:10.277 ************************************ 00:29:10.277 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:10.278 10:24:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:10.278 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:10.278 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:10.278 fio-3.35 00:29:10.278 Starting 2 threads 00:29:20.267 00:29:20.267 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1168823: Mon Jun 10 10:24:41 2024 00:29:20.267 write: IOPS=58.5k, BW=228MiB/s (240MB/s)(2285MiB/10001msec); 0 zone resets 00:29:20.267 slat (usec): min=10, max=307, avg=14.68, stdev= 4.43 00:29:20.267 clat (usec): min=25, max=1341, avg=113.59, stdev=62.63 00:29:20.267 lat (usec): min=36, max=1358, avg=128.28, stdev=65.02 00:29:20.267 clat percentiles (usec): 00:29:20.267 | 50.000th=[ 91], 99.000th=[ 239], 99.900th=[ 273], 99.990th=[ 510], 00:29:20.267 | 99.999th=[ 766] 00:29:20.267 bw ( KiB/s): min=206408, max=240896, per=99.93%, avg=233768.00, stdev=3584.47, samples=38 00:29:20.267 iops : min=51602, max=60224, avg=58442.00, stdev=896.12, samples=38 00:29:20.267 trim: IOPS=58.5k, BW=228MiB/s (240MB/s)(2285MiB/10001msec); 0 zone resets 00:29:20.267 slat (usec): min=4, max=438, avg= 6.65, stdev= 2.36 00:29:20.267 clat (usec): min=35, max=1225, avg=76.04, stdev=22.99 00:29:20.267 lat (usec): min=41, max=1233, avg=82.69, stdev=23.10 00:29:20.267 clat percentiles (usec): 00:29:20.267 | 50.000th=[ 77], 99.000th=[ 128], 99.900th=[ 149], 99.990th=[ 281], 00:29:20.267 | 99.999th=[ 529] 00:29:20.267 bw ( KiB/s): min=206432, max=240896, per=99.94%, avg=233769.68, stdev=3582.57, samples=38 00:29:20.267 iops : min=51608, max=60224, avg=58442.42, stdev=895.64, samples=38 00:29:20.267 lat (usec) : 50=16.38%, 100=53.51%, 250=29.90%, 500=0.19%, 750=0.01% 00:29:20.267 lat (usec) : 1000=0.01% 00:29:20.267 lat (msec) : 2=0.01% 00:29:20.267 cpu : usr=99.73%, sys=0.00%, ctx=30, majf=0, minf=284 00:29:20.267 IO depths : 1=7.5%, 2=17.5%, 4=60.0%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:20.267 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:20.267 complete : 0=0.0%, 4=87.0%, 8=13.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:20.267 issued rwts: total=0,584859,584859,0 short=0,0,0,0 dropped=0,0,0,0 00:29:20.267 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:20.267 00:29:20.267 Run status group 0 (all jobs): 00:29:20.267 WRITE: 
bw=228MiB/s (240MB/s), 228MiB/s-228MiB/s (240MB/s-240MB/s), io=2285MiB (2396MB), run=10001-10001msec 00:29:20.267 TRIM: bw=228MiB/s (240MB/s), 228MiB/s-228MiB/s (240MB/s-240MB/s), io=2285MiB (2396MB), run=10001-10001msec 00:29:20.267 00:29:20.267 real 0m11.023s 00:29:20.267 user 0m27.333s 00:29:20.267 sys 0m0.298s 00:29:20.267 10:24:41 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:20.267 10:24:41 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:29:20.267 ************************************ 00:29:20.267 END TEST bdev_fio_trim 00:29:20.267 ************************************ 00:29:20.267 10:24:41 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:29:20.267 10:24:41 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:20.267 10:24:41 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:29:20.267 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:20.267 10:24:41 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:29:20.267 00:29:20.267 real 0m22.318s 00:29:20.267 user 0m54.208s 00:29:20.267 sys 0m0.783s 00:29:20.267 10:24:41 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:20.267 10:24:41 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:20.268 ************************************ 00:29:20.268 END TEST bdev_fio 00:29:20.268 ************************************ 00:29:20.268 10:24:41 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:20.268 10:24:41 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:20.268 10:24:41 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:29:20.268 10:24:41 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:20.268 10:24:41 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:20.268 ************************************ 00:29:20.268 START TEST bdev_verify 00:29:20.268 ************************************ 00:29:20.268 10:24:41 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:20.268 [2024-06-10 10:24:41.732908] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:29:20.268 [2024-06-10 10:24:41.732950] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1170654 ] 00:29:20.268 [2024-06-10 10:24:41.819860] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:20.268 [2024-06-10 10:24:41.885840] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:29:20.268 [2024-06-10 10:24:41.886010] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:20.268 [2024-06-10 10:24:42.025232] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:20.268 [2024-06-10 10:24:42.025280] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:20.268 [2024-06-10 10:24:42.025289] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:20.268 [2024-06-10 10:24:42.033252] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:20.268 [2024-06-10 10:24:42.033263] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:20.268 [2024-06-10 10:24:42.033269] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:20.268 [2024-06-10 10:24:42.041274] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:20.268 [2024-06-10 10:24:42.041284] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:20.268 [2024-06-10 10:24:42.041290] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:20.268 Running I/O for 5 seconds... 
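The bdev_fio_rw_verify and bdev_fio_trim stages earlier in this run generate /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio (one [job_<bdev>] section with a filename=<bdev> line per crypto bdev, plus serialize_overlap=1 for this fio version) and then launch fio with SPDK's external bdev ioengine by LD_PRELOAD-ing build/fio/spdk_bdev. A minimal hand-written sketch of that invocation follows; the [global] job defaults are assumptions, while the paths, job names and fio options are taken from the xtrace above.

# Sketch only: reconstructs the fio-over-SPDK-bdev invocation seen above.
# The [global] contents are assumptions; everything else mirrors the log.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

cat > "$SPDK/test/bdev/bdev.fio" <<'EOF'
[global]
# ioengine is provided by the LD_PRELOAD-ed spdk_bdev plugin
ioengine=spdk_bdev
# the next four options are assumptions about the generated defaults
thread=1
direct=1
rw=randwrite
verify=crc32c
# serialize_overlap=1 is echoed by the test script for this fio/AIO combination
serialize_overlap=1

[job_crypto_ram]
filename=crypto_ram

[job_crypto_ram3]
filename=crypto_ram3
EOF

LD_PRELOAD="$SPDK/build/fio/spdk_bdev" /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    "$SPDK/test/bdev/bdev.fio" --verify_state_save=0 \
    --spdk_json_conf="$SPDK/test/bdev/bdev.json" \
    --aux-path="$SPDK/../output"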
00:29:25.548 00:29:25.548 Latency(us) 00:29:25.548 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:25.548 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:25.548 Verification LBA range: start 0x0 length 0x800 00:29:25.548 crypto_ram : 5.02 6907.81 26.98 0.00 0.00 18455.06 1310.72 32465.53 00:29:25.548 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:25.548 Verification LBA range: start 0x800 length 0x800 00:29:25.548 crypto_ram : 5.02 6651.01 25.98 0.00 0.00 19153.96 1209.90 32465.53 00:29:25.548 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:25.548 Verification LBA range: start 0x0 length 0x800 00:29:25.548 crypto_ram3 : 5.03 3462.33 13.52 0.00 0.00 36766.66 1461.96 35893.56 00:29:25.548 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:25.548 Verification LBA range: start 0x800 length 0x800 00:29:25.548 crypto_ram3 : 5.03 3334.12 13.02 0.00 0.00 38164.90 1279.21 35893.56 00:29:25.548 =================================================================================================================== 00:29:25.548 Total : 20355.27 79.51 0.00 0.00 25031.75 1209.90 35893.56 00:29:25.548 00:29:25.548 real 0m5.579s 00:29:25.548 user 0m10.688s 00:29:25.548 sys 0m0.157s 00:29:25.548 10:24:47 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:25.548 10:24:47 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:29:25.548 ************************************ 00:29:25.548 END TEST bdev_verify 00:29:25.548 ************************************ 00:29:25.548 10:24:47 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:25.548 10:24:47 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:29:25.548 10:24:47 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:25.548 10:24:47 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:25.548 ************************************ 00:29:25.548 START TEST bdev_verify_big_io 00:29:25.548 ************************************ 00:29:25.548 10:24:47 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:25.548 [2024-06-10 10:24:47.389975] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:29:25.548 [2024-06-10 10:24:47.390017] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1171584 ] 00:29:25.808 [2024-06-10 10:24:47.474913] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:25.808 [2024-06-10 10:24:47.537800] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:29:25.808 [2024-06-10 10:24:47.537805] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:26.068 [2024-06-10 10:24:47.682867] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:26.068 [2024-06-10 10:24:47.682919] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:26.068 [2024-06-10 10:24:47.682928] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:26.068 [2024-06-10 10:24:47.690892] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:26.068 [2024-06-10 10:24:47.690904] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:26.068 [2024-06-10 10:24:47.690910] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:26.068 [2024-06-10 10:24:47.698909] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:26.068 [2024-06-10 10:24:47.698920] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:26.068 [2024-06-10 10:24:47.698925] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:26.068 Running I/O for 5 seconds... 
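Every stage in this blockdev_crypto_sw run, the fio jobs and the bdevperf verify workloads alike, targets the same two software-crypto vbdevs: crypto_ram stacked on Malloc0 with key test_dek_sw, and crypto_ram3 stacked on crypto_ram2 (itself on Malloc1 with test_dek_sw2) with key test_dek_sw3, which is what the repeated "Found key ..." and "vbdev creation deferred pending base bdev arrival" notices refer to. A sketch of building that topology by hand with rpc.py follows; only the bdev and key names come from the log, while the ciphers, key material, malloc sizes and exact flag spellings are assumptions that may differ from the test's real configuration.

# Hand-built equivalent of the crypto_sw bdev stack exercised above (sketch).
# Bdev/key names match the log; ciphers, key hex values, sizes and flag
# spellings are illustrative assumptions and may vary by SPDK release.
RPC=./scripts/rpc.py

$RPC bdev_malloc_create -b Malloc0 16 512      # base for crypto_ram (sizes assumed)
$RPC bdev_malloc_create -b Malloc1 16 4096     # base for crypto_ram2 (sizes assumed)

$RPC accel_crypto_key_create -c AES_CBC -k 00112233445566778899aabbccddeeff -n test_dek_sw
$RPC accel_crypto_key_create -c AES_XTS -k 00112233445566778899aabbccddeeff \
    -e ffeeddccbbaa99887766554433221100 -n test_dek_sw2
$RPC accel_crypto_key_create -c AES_XTS -k 0123456789abcdef0123456789abcdef \
    -e fedcba9876543210fedcba9876543210 -n test_dek_sw3

# Stack the vbdevs: Malloc0 -> crypto_ram, Malloc1 -> crypto_ram2 -> crypto_ram3
$RPC bdev_crypto_create Malloc0     crypto_ram  -n test_dek_sw
$RPC bdev_crypto_create Malloc1     crypto_ram2 -n test_dek_sw2
$RPC bdev_crypto_create crypto_ram2 crypto_ram3 -n test_dek_sw3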
00:29:31.344 00:29:31.344 Latency(us) 00:29:31.344 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:31.344 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:29:31.344 Verification LBA range: start 0x0 length 0x80 00:29:31.344 crypto_ram : 5.06 480.35 30.02 0.00 0.00 259814.00 3806.13 350063.06 00:29:31.344 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:29:31.344 Verification LBA range: start 0x80 length 0x80 00:29:31.344 crypto_ram : 5.09 477.75 29.86 0.00 0.00 261111.50 3780.92 358129.03 00:29:31.344 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:29:31.344 Verification LBA range: start 0x0 length 0x80 00:29:31.344 crypto_ram3 : 5.29 290.37 18.15 0.00 0.00 414247.01 3100.36 362968.62 00:29:31.344 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:29:31.344 Verification LBA range: start 0x80 length 0x80 00:29:31.344 crypto_ram3 : 5.26 267.73 16.73 0.00 0.00 447824.17 3188.58 361355.42 00:29:31.344 =================================================================================================================== 00:29:31.344 Total : 1516.20 94.76 0.00 0.00 324501.87 3100.36 362968.62 00:29:31.344 00:29:31.344 real 0m5.844s 00:29:31.344 user 0m11.229s 00:29:31.344 sys 0m0.147s 00:29:31.344 10:24:53 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:31.344 10:24:53 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:29:31.344 ************************************ 00:29:31.344 END TEST bdev_verify_big_io 00:29:31.344 ************************************ 00:29:31.604 10:24:53 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:31.604 10:24:53 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:29:31.604 10:24:53 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:31.604 10:24:53 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:31.604 ************************************ 00:29:31.604 START TEST bdev_write_zeroes 00:29:31.604 ************************************ 00:29:31.604 10:24:53 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:31.604 [2024-06-10 10:24:53.312780] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:29:31.604 [2024-06-10 10:24:53.312836] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1172517 ] 00:29:31.604 [2024-06-10 10:24:53.400597] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:31.863 [2024-06-10 10:24:53.470343] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:31.863 [2024-06-10 10:24:53.608117] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:31.863 [2024-06-10 10:24:53.608167] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:31.863 [2024-06-10 10:24:53.608175] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:31.863 [2024-06-10 10:24:53.616144] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:31.863 [2024-06-10 10:24:53.616155] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:31.863 [2024-06-10 10:24:53.616161] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:31.863 [2024-06-10 10:24:53.624156] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:31.863 [2024-06-10 10:24:53.624166] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:31.863 [2024-06-10 10:24:53.624172] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:31.863 Running I/O for 1 seconds... 00:29:33.247 00:29:33.247 Latency(us) 00:29:33.247 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:33.247 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:29:33.247 crypto_ram : 1.01 33190.37 129.65 0.00 0.00 3847.31 1701.42 5394.12 00:29:33.247 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:29:33.247 crypto_ram3 : 1.01 16623.26 64.93 0.00 0.00 7652.37 2797.88 7914.73 00:29:33.247 =================================================================================================================== 00:29:33.247 Total : 49813.63 194.58 0.00 0.00 5118.90 1701.42 7914.73 00:29:33.247 00:29:33.247 real 0m1.542s 00:29:33.247 user 0m1.378s 00:29:33.247 sys 0m0.149s 00:29:33.247 10:24:54 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:33.247 10:24:54 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:29:33.247 ************************************ 00:29:33.247 END TEST bdev_write_zeroes 00:29:33.247 ************************************ 00:29:33.247 10:24:54 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:33.247 10:24:54 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:29:33.247 10:24:54 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:33.247 10:24:54 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:33.247 ************************************ 00:29:33.247 START TEST bdev_json_nonenclosed 00:29:33.247 
************************************ 00:29:33.247 10:24:54 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:33.247 [2024-06-10 10:24:54.920871] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:29:33.247 [2024-06-10 10:24:54.920915] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1172835 ] 00:29:33.247 [2024-06-10 10:24:55.008463] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:33.247 [2024-06-10 10:24:55.080365] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:33.247 [2024-06-10 10:24:55.080418] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:29:33.247 [2024-06-10 10:24:55.080429] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:29:33.247 [2024-06-10 10:24:55.080435] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:29:33.508 00:29:33.508 real 0m0.271s 00:29:33.508 user 0m0.165s 00:29:33.508 sys 0m0.105s 00:29:33.508 10:24:55 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:33.508 10:24:55 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:29:33.508 ************************************ 00:29:33.508 END TEST bdev_json_nonenclosed 00:29:33.508 ************************************ 00:29:33.508 10:24:55 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:33.508 10:24:55 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:29:33.509 10:24:55 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:33.509 10:24:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:33.509 ************************************ 00:29:33.509 START TEST bdev_json_nonarray 00:29:33.509 ************************************ 00:29:33.509 10:24:55 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:33.509 [2024-06-10 10:24:55.270212] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:29:33.509 [2024-06-10 10:24:55.270261] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1172862 ] 00:29:33.509 [2024-06-10 10:24:55.358224] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:33.770 [2024-06-10 10:24:55.435771] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:33.770 [2024-06-10 10:24:55.435835] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:29:33.770 [2024-06-10 10:24:55.435846] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:29:33.770 [2024-06-10 10:24:55.435853] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:29:33.770 00:29:33.770 real 0m0.289s 00:29:33.770 user 0m0.174s 00:29:33.770 sys 0m0.113s 00:29:33.770 10:24:55 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:33.770 10:24:55 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:29:33.770 ************************************ 00:29:33.770 END TEST bdev_json_nonarray 00:29:33.770 ************************************ 00:29:33.770 10:24:55 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:29:33.770 10:24:55 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:29:33.770 10:24:55 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:29:33.770 10:24:55 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:29:33.770 10:24:55 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:29:33.770 10:24:55 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:33.770 10:24:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:33.770 ************************************ 00:29:33.770 START TEST bdev_crypto_enomem 00:29:33.770 ************************************ 00:29:33.770 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # bdev_crypto_enomem 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=1172897 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 1172897 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@830 -- # '[' -z 1172897 ']' 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:33.771 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
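bdev_json_nonenclosed and bdev_json_nonarray above are negative tests: bdevperf is pointed at deliberately malformed --json files and must fail cleanly, which is why the log shows "Invalid JSON configuration: not enclosed in {}", "'subsystems' should be an array", and spdk_app_stop'd on non-zero instead of any I/O. For contrast, a minimal well-formed SPDK JSON config has the shape sketched below; the single malloc entry is an illustrative assumption, the point is the outer object and the "subsystems" array.

# Minimal well-formed shape for --json / --spdk_json_conf (sketch).
# nonenclosed.json drops the outer { }, nonarray.json makes "subsystems"
# something other than an array -- both are rejected as shown above.
cat > /tmp/minimal_bdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 32768, "block_size": 512 }
        }
      ]
    }
  ]
}
EOF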
00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:33.771 10:24:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:29:33.771 [2024-06-10 10:24:55.615691] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:29:33.771 [2024-06-10 10:24:55.615736] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1172897 ] 00:29:34.067 [2024-06-10 10:24:55.685560] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:34.067 [2024-06-10 10:24:55.751400] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@863 -- # return 0 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:34.674 true 00:29:34.674 base0 00:29:34.674 true 00:29:34.674 [2024-06-10 10:24:56.479261] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:34.674 crypt0 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_name=crypt0 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # local i 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:34.674 [ 00:29:34.674 { 00:29:34.674 "name": "crypt0", 00:29:34.674 "aliases": [ 00:29:34.674 "152b522f-f0d4-5918-b5e8-cc3a82c0541d" 00:29:34.674 ], 00:29:34.674 "product_name": 
"crypto", 00:29:34.674 "block_size": 512, 00:29:34.674 "num_blocks": 2097152, 00:29:34.674 "uuid": "152b522f-f0d4-5918-b5e8-cc3a82c0541d", 00:29:34.674 "assigned_rate_limits": { 00:29:34.674 "rw_ios_per_sec": 0, 00:29:34.674 "rw_mbytes_per_sec": 0, 00:29:34.674 "r_mbytes_per_sec": 0, 00:29:34.674 "w_mbytes_per_sec": 0 00:29:34.674 }, 00:29:34.674 "claimed": false, 00:29:34.674 "zoned": false, 00:29:34.674 "supported_io_types": { 00:29:34.674 "read": true, 00:29:34.674 "write": true, 00:29:34.674 "unmap": false, 00:29:34.674 "write_zeroes": true, 00:29:34.674 "flush": false, 00:29:34.674 "reset": true, 00:29:34.674 "compare": false, 00:29:34.674 "compare_and_write": false, 00:29:34.674 "abort": false, 00:29:34.674 "nvme_admin": false, 00:29:34.674 "nvme_io": false 00:29:34.674 }, 00:29:34.674 "memory_domains": [ 00:29:34.674 { 00:29:34.674 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:34.674 "dma_device_type": 2 00:29:34.674 } 00:29:34.674 ], 00:29:34.674 "driver_specific": { 00:29:34.674 "crypto": { 00:29:34.674 "base_bdev_name": "EE_base0", 00:29:34.674 "name": "crypt0", 00:29:34.674 "key_name": "test_dek_sw" 00:29:34.674 } 00:29:34.674 } 00:29:34.674 } 00:29:34.674 ] 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@906 -- # return 0 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=1173184 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:29:34.674 10:24:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:34.934 Running I/O for 5 seconds... 
00:29:35.873 10:24:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:29:35.873 10:24:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:29:35.873 10:24:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:35.873 10:24:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:29:35.873 10:24:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 1173184 00:29:40.073 00:29:40.073 Latency(us) 00:29:40.073 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:40.073 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:29:40.073 crypt0 : 5.00 45329.11 177.07 0.00 0.00 702.85 348.16 1172.09 00:29:40.073 =================================================================================================================== 00:29:40.073 Total : 45329.11 177.07 0.00 0.00 702.85 348.16 1172.09 00:29:40.073 0 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 1172897 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@949 -- # '[' -z 1172897 ']' 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # kill -0 1172897 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # uname 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1172897 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1172897' 00:29:40.074 killing process with pid 1172897 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@968 -- # kill 1172897 00:29:40.074 Received shutdown signal, test time was about 5.000000 seconds 00:29:40.074 00:29:40.074 Latency(us) 00:29:40.074 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:40.074 =================================================================================================================== 00:29:40.074 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@973 -- # wait 1172897 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:29:40.074 00:29:40.074 real 0m6.245s 00:29:40.074 user 0m6.515s 00:29:40.074 sys 0m0.246s 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:29:40.074 10:25:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:40.074 ************************************ 00:29:40.074 END TEST bdev_crypto_enomem 00:29:40.074 ************************************ 00:29:40.074 10:25:01 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:29:40.074 10:25:01 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:29:40.074 10:25:01 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:29:40.074 10:25:01 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:40.074 10:25:01 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:29:40.074 10:25:01 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:29:40.074 10:25:01 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:29:40.074 10:25:01 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:29:40.074 00:29:40.074 real 0m51.020s 00:29:40.074 user 1m37.262s 00:29:40.074 sys 0m4.474s 00:29:40.074 10:25:01 blockdev_crypto_sw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:40.074 10:25:01 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:40.074 ************************************ 00:29:40.074 END TEST blockdev_crypto_sw 00:29:40.074 ************************************ 00:29:40.074 10:25:01 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:29:40.074 10:25:01 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:29:40.074 10:25:01 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:40.074 10:25:01 -- common/autotest_common.sh@10 -- # set +x 00:29:40.074 ************************************ 00:29:40.074 START TEST blockdev_crypto_qat 00:29:40.074 ************************************ 00:29:40.074 10:25:01 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:29:40.335 * Looking for test storage... 
00:29:40.335 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1174159 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1174159 00:29:40.335 10:25:02 blockdev_crypto_qat -- common/autotest_common.sh@830 -- # '[' -z 1174159 ']' 00:29:40.335 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:29:40.335 10:25:02 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:40.335 10:25:02 blockdev_crypto_qat -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:40.335 10:25:02 blockdev_crypto_qat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
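The blockdev_crypto_qat suite that starts here reruns the same bdev tests with the accel framework's encrypt/decrypt operations routed to the DPDK cryptodev module on QAT hardware; the notices that follow report "Using driver crypto_qat", the encrypt/decrypt opc assignments, "Found crypto devices: 96", and four QAT-keyed vbdevs crypto_ram..crypto_ram3 (keys test_dek_qat_cbc, _xts, _cbc2, _xts2) on the Malloc bases. A hedged sketch of the RPC sequence behind those notices is below; the RPC names follow current SPDK usage, but the flag spellings (particularly for the set-driver call) and all key values are assumptions, not taken from the log.

# Sketch of the QAT-backed crypto setup reported in the notices below.
# Flag spellings and key values are assumptions; spdk_tgt was started with
# --wait-for-rpc, so driver/opc selection happens before framework init.
RPC=./scripts/rpc.py

$RPC dpdk_cryptodev_set_driver crypto_qat            # "Using driver crypto_qat"
$RPC accel_assign_opc -o encrypt -m dpdk_cryptodev   # encrypt -> dpdk_cryptodev
$RPC accel_assign_opc -o decrypt -m dpdk_cryptodev   # decrypt -> dpdk_cryptodev
$RPC framework_start_init                            # triggers "Found crypto devices: 96"

$RPC accel_crypto_key_create -c AES_CBC -k 00112233445566778899aabbccddeeff -n test_dek_qat_cbc
$RPC accel_crypto_key_create -c AES_XTS -k 00112233445566778899aabbccddeeff \
    -e ffeeddccbbaa99887766554433221100 -n test_dek_qat_xts

$RPC bdev_crypto_create Malloc0 crypto_ram  -n test_dek_qat_cbc
$RPC bdev_crypto_create Malloc1 crypto_ram1 -n test_dek_qat_xts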
00:29:40.335 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:40.335 10:25:02 blockdev_crypto_qat -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:40.335 10:25:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:40.335 [2024-06-10 10:25:02.105356] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:29:40.335 [2024-06-10 10:25:02.105429] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1174159 ] 00:29:40.335 [2024-06-10 10:25:02.182227] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:40.595 [2024-06-10 10:25:02.275394] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:41.165 10:25:02 blockdev_crypto_qat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:41.165 10:25:02 blockdev_crypto_qat -- common/autotest_common.sh@863 -- # return 0 00:29:41.165 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:29:41.165 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:29:41.165 10:25:02 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:29:41.165 10:25:02 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:29:41.165 10:25:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:41.165 [2024-06-10 10:25:02.989488] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:29:41.165 [2024-06-10 10:25:02.997523] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:41.165 [2024-06-10 10:25:03.005535] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:41.425 [2024-06-10 10:25:03.067780] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:29:43.970 true 00:29:43.970 true 00:29:43.970 true 00:29:43.970 true 00:29:43.970 Malloc0 00:29:43.970 Malloc1 00:29:43.970 Malloc2 00:29:43.970 Malloc3 00:29:43.970 [2024-06-10 10:25:05.445606] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:29:43.970 crypto_ram 00:29:43.970 [2024-06-10 10:25:05.453625] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:29:43.970 crypto_ram1 00:29:43.970 [2024-06-10 10:25:05.461646] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:29:43.970 crypto_ram2 00:29:43.970 [2024-06-10 10:25:05.469668] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:29:43.970 crypto_ram3 00:29:43.970 [ 00:29:43.970 { 00:29:43.970 "name": "Malloc1", 00:29:43.970 "aliases": [ 00:29:43.970 "024a60ed-f618-40d0-a4bb-8c2ce49a5182" 00:29:43.970 ], 00:29:43.970 "product_name": "Malloc disk", 00:29:43.970 "block_size": 512, 00:29:43.970 "num_blocks": 65536, 00:29:43.970 "uuid": "024a60ed-f618-40d0-a4bb-8c2ce49a5182", 00:29:43.970 "assigned_rate_limits": { 00:29:43.970 "rw_ios_per_sec": 0, 00:29:43.970 "rw_mbytes_per_sec": 0, 00:29:43.970 "r_mbytes_per_sec": 0, 00:29:43.970 "w_mbytes_per_sec": 0 00:29:43.970 }, 00:29:43.970 "claimed": true, 00:29:43.970 "claim_type": "exclusive_write", 00:29:43.970 "zoned": false, 00:29:43.970 "supported_io_types": { 00:29:43.970 "read": true, 
00:29:43.970 "write": true, 00:29:43.971 "unmap": true, 00:29:43.971 "write_zeroes": true, 00:29:43.971 "flush": true, 00:29:43.971 "reset": true, 00:29:43.971 "compare": false, 00:29:43.971 "compare_and_write": false, 00:29:43.971 "abort": true, 00:29:43.971 "nvme_admin": false, 00:29:43.971 "nvme_io": false 00:29:43.971 }, 00:29:43.971 "memory_domains": [ 00:29:43.971 { 00:29:43.971 "dma_device_id": "system", 00:29:43.971 "dma_device_type": 1 00:29:43.971 }, 00:29:43.971 { 00:29:43.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:43.971 "dma_device_type": 2 00:29:43.971 } 00:29:43.971 ], 00:29:43.971 "driver_specific": {} 00:29:43.971 } 00:29:43.971 ] 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0026a223-3ea6-573a-9fe4-b432517699d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0026a223-3ea6-573a-9fe4-b432517699d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' 
"claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "0601ff36-398d-5d5a-a510-7802b96ea5a1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0601ff36-398d-5d5a-a510-7802b96ea5a1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4d8a250d-1085-5ec5-b1b5-d9c695c8f0d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4d8a250d-1085-5ec5-b1b5-d9c695c8f0d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f73e9eb5-7de2-5655-bfa9-9bd82616aac1"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f73e9eb5-7de2-5655-bfa9-9bd82616aac1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' 
},' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:29:43.971 10:25:05 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 1174159 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@949 -- # '[' -z 1174159 ']' 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # kill -0 1174159 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # uname 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1174159 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1174159' 00:29:43.971 killing process with pid 1174159 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@968 -- # kill 1174159 00:29:43.971 10:25:05 blockdev_crypto_qat -- common/autotest_common.sh@973 -- # wait 1174159 00:29:44.231 10:25:06 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:44.231 10:25:06 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:29:44.231 10:25:06 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:29:44.231 10:25:06 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:44.231 10:25:06 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:44.231 ************************************ 00:29:44.231 START TEST bdev_hello_world 00:29:44.231 ************************************ 00:29:44.231 10:25:06 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:29:44.492 [2024-06-10 10:25:06.120804] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:29:44.492 [2024-06-10 10:25:06.120856] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1174786 ] 00:29:44.492 [2024-06-10 10:25:06.206095] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:44.492 [2024-06-10 10:25:06.270064] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:44.492 [2024-06-10 10:25:06.291057] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:29:44.492 [2024-06-10 10:25:06.299086] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:44.492 [2024-06-10 10:25:06.307109] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:44.751 [2024-06-10 10:25:06.391246] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:29:47.297 [2024-06-10 10:25:08.543867] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:29:47.297 [2024-06-10 10:25:08.543916] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:47.297 [2024-06-10 10:25:08.543925] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:47.297 [2024-06-10 10:25:08.551884] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:29:47.297 [2024-06-10 10:25:08.551895] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:47.297 [2024-06-10 10:25:08.551900] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:47.297 [2024-06-10 10:25:08.559905] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:29:47.297 [2024-06-10 10:25:08.559915] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:47.297 [2024-06-10 10:25:08.559920] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:47.297 [2024-06-10 10:25:08.567924] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:29:47.297 [2024-06-10 10:25:08.567934] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:47.297 [2024-06-10 10:25:08.567939] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:47.297 [2024-06-10 10:25:08.629202] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:29:47.297 [2024-06-10 10:25:08.629227] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:29:47.297 [2024-06-10 10:25:08.629237] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:29:47.297 [2024-06-10 10:25:08.630273] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:29:47.297 [2024-06-10 10:25:08.630328] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:29:47.297 [2024-06-10 10:25:08.630338] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:29:47.297 [2024-06-10 10:25:08.630368] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
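
At this point hello_bdev has written and read back "Hello World!" through crypto_ram, the crypto vbdev stacked on Malloc0 with the test_dek_qat_cbc key (the earlier bdev_get_bdevs dump shows all four Malloc/crypto pairs and their keys). The bdev.json consumed here is assembled by blockdev.sh from save_subsystem_config output and is not printed verbatim in the log, so the RPC sequence below is only an illustrative reconstruction of one leg of that stack; the accel_crypto_key_create and bdev_crypto_create flags are assumptions that vary by SPDK release, and the key bytes are placeholders.

  # Register the DEK under the name the crypto vbdev refers to (placeholder key material).
  ./scripts/rpc.py accel_crypto_key_create -c AES_CBC -k 00112233445566778899aabbccddeeff -n test_dek_qat_cbc

  # 32 MiB ramdisk with 512-byte blocks, then the crypto vbdev layered on top of it.
  ./scripts/rpc.py bdev_malloc_create -b Malloc0 32 512
  ./scripts/rpc.py bdev_crypto_create -n test_dek_qat_cbc Malloc0 crypto_ram

  # The harness then replays the saved config and opens the vbdev by name:
  ./build/examples/hello_bdev --json ./test/bdev/bdev.json -b crypto_ram

The other three legs (crypto_ram1..crypto_ram3 over Malloc1..Malloc3 with the xts, cbc2 and xts2 keys) follow the same pattern, with a 4096-byte block size for the last two.
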
00:29:47.297 00:29:47.297 [2024-06-10 10:25:08.630379] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:29:47.297 00:29:47.297 real 0m2.802s 00:29:47.297 user 0m2.547s 00:29:47.297 sys 0m0.224s 00:29:47.297 10:25:08 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:47.297 10:25:08 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:29:47.297 ************************************ 00:29:47.297 END TEST bdev_hello_world 00:29:47.297 ************************************ 00:29:47.297 10:25:08 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:29:47.297 10:25:08 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:29:47.297 10:25:08 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:47.297 10:25:08 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:47.297 ************************************ 00:29:47.297 START TEST bdev_bounds 00:29:47.297 ************************************ 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1175166 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1175166' 00:29:47.298 Process bdevio pid: 1175166 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1175166 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 1175166 ']' 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:47.298 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:47.298 10:25:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:29:47.298 [2024-06-10 10:25:08.995583] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:29:47.298 [2024-06-10 10:25:08.995631] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1175166 ] 00:29:47.298 [2024-06-10 10:25:09.087091] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:47.298 [2024-06-10 10:25:09.161140] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:29:47.298 [2024-06-10 10:25:09.161268] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:29:47.298 [2024-06-10 10:25:09.161271] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:47.559 [2024-06-10 10:25:09.182330] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:29:47.559 [2024-06-10 10:25:09.190361] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:47.560 [2024-06-10 10:25:09.198377] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:47.560 [2024-06-10 10:25:09.284085] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:29:50.105 [2024-06-10 10:25:11.434947] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:29:50.105 [2024-06-10 10:25:11.435006] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:50.105 [2024-06-10 10:25:11.435015] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:50.105 [2024-06-10 10:25:11.442965] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:29:50.105 [2024-06-10 10:25:11.442975] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:50.105 [2024-06-10 10:25:11.442981] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:50.105 [2024-06-10 10:25:11.450983] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:29:50.105 [2024-06-10 10:25:11.450993] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:50.105 [2024-06-10 10:25:11.450999] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:50.105 [2024-06-10 10:25:11.459004] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:29:50.105 [2024-06-10 10:25:11.459015] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:50.105 [2024-06-10 10:25:11.459020] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:29:50.106 I/O targets: 00:29:50.106 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:29:50.106 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:29:50.106 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:29:50.106 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:29:50.106 00:29:50.106 00:29:50.106 
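
The CUnit report that follows comes from bdevio run in wait mode: bdev_bounds starts the binary with -w and -s 0 against the same bdev.json, then triggers every suite over RPC with tests.py (both commands are visible verbatim in the trace above). Reproduced by hand from the SPDK tree, with the harness's empty trailing argument omitted:

  # Start bdevio with no reserved memory and have it wait for an RPC trigger.
  ./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json &

  # Once it is listening, kick off the full suite; the summary below is its output.
  ./test/bdev/bdevio/tests.py perform_tests
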
CUnit - A unit testing framework for C - Version 2.1-3 00:29:50.106 http://cunit.sourceforge.net/ 00:29:50.106 00:29:50.106 00:29:50.106 Suite: bdevio tests on: crypto_ram3 00:29:50.106 Test: blockdev write read block ...passed 00:29:50.106 Test: blockdev write zeroes read block ...passed 00:29:50.106 Test: blockdev write zeroes read no split ...passed 00:29:50.106 Test: blockdev write zeroes read split ...passed 00:29:50.106 Test: blockdev write zeroes read split partial ...passed 00:29:50.106 Test: blockdev reset ...passed 00:29:50.106 Test: blockdev write read 8 blocks ...passed 00:29:50.106 Test: blockdev write read size > 128k ...passed 00:29:50.106 Test: blockdev write read invalid size ...passed 00:29:50.106 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:50.106 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:50.106 Test: blockdev write read max offset ...passed 00:29:50.106 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:50.106 Test: blockdev writev readv 8 blocks ...passed 00:29:50.106 Test: blockdev writev readv 30 x 1block ...passed 00:29:50.106 Test: blockdev writev readv block ...passed 00:29:50.106 Test: blockdev writev readv size > 128k ...passed 00:29:50.106 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:50.106 Test: blockdev comparev and writev ...passed 00:29:50.106 Test: blockdev nvme passthru rw ...passed 00:29:50.106 Test: blockdev nvme passthru vendor specific ...passed 00:29:50.106 Test: blockdev nvme admin passthru ...passed 00:29:50.106 Test: blockdev copy ...passed 00:29:50.106 Suite: bdevio tests on: crypto_ram2 00:29:50.106 Test: blockdev write read block ...passed 00:29:50.106 Test: blockdev write zeroes read block ...passed 00:29:50.106 Test: blockdev write zeroes read no split ...passed 00:29:50.106 Test: blockdev write zeroes read split ...passed 00:29:50.106 Test: blockdev write zeroes read split partial ...passed 00:29:50.106 Test: blockdev reset ...passed 00:29:50.106 Test: blockdev write read 8 blocks ...passed 00:29:50.106 Test: blockdev write read size > 128k ...passed 00:29:50.106 Test: blockdev write read invalid size ...passed 00:29:50.106 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:50.106 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:50.106 Test: blockdev write read max offset ...passed 00:29:50.106 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:50.106 Test: blockdev writev readv 8 blocks ...passed 00:29:50.106 Test: blockdev writev readv 30 x 1block ...passed 00:29:50.106 Test: blockdev writev readv block ...passed 00:29:50.106 Test: blockdev writev readv size > 128k ...passed 00:29:50.106 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:50.106 Test: blockdev comparev and writev ...passed 00:29:50.106 Test: blockdev nvme passthru rw ...passed 00:29:50.106 Test: blockdev nvme passthru vendor specific ...passed 00:29:50.106 Test: blockdev nvme admin passthru ...passed 00:29:50.106 Test: blockdev copy ...passed 00:29:50.106 Suite: bdevio tests on: crypto_ram1 00:29:50.106 Test: blockdev write read block ...passed 00:29:50.106 Test: blockdev write zeroes read block ...passed 00:29:50.106 Test: blockdev write zeroes read no split ...passed 00:29:50.106 Test: blockdev write zeroes read split ...passed 00:29:50.106 Test: blockdev write zeroes read split partial ...passed 00:29:50.106 Test: blockdev reset ...passed 00:29:50.106 
Test: blockdev write read 8 blocks ...passed 00:29:50.106 Test: blockdev write read size > 128k ...passed 00:29:50.106 Test: blockdev write read invalid size ...passed 00:29:50.106 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:50.106 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:50.106 Test: blockdev write read max offset ...passed 00:29:50.106 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:50.106 Test: blockdev writev readv 8 blocks ...passed 00:29:50.106 Test: blockdev writev readv 30 x 1block ...passed 00:29:50.106 Test: blockdev writev readv block ...passed 00:29:50.106 Test: blockdev writev readv size > 128k ...passed 00:29:50.106 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:50.106 Test: blockdev comparev and writev ...passed 00:29:50.106 Test: blockdev nvme passthru rw ...passed 00:29:50.106 Test: blockdev nvme passthru vendor specific ...passed 00:29:50.106 Test: blockdev nvme admin passthru ...passed 00:29:50.106 Test: blockdev copy ...passed 00:29:50.106 Suite: bdevio tests on: crypto_ram 00:29:50.106 Test: blockdev write read block ...passed 00:29:50.106 Test: blockdev write zeroes read block ...passed 00:29:50.106 Test: blockdev write zeroes read no split ...passed 00:29:50.106 Test: blockdev write zeroes read split ...passed 00:29:50.106 Test: blockdev write zeroes read split partial ...passed 00:29:50.106 Test: blockdev reset ...passed 00:29:50.106 Test: blockdev write read 8 blocks ...passed 00:29:50.106 Test: blockdev write read size > 128k ...passed 00:29:50.106 Test: blockdev write read invalid size ...passed 00:29:50.106 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:50.106 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:50.106 Test: blockdev write read max offset ...passed 00:29:50.106 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:50.106 Test: blockdev writev readv 8 blocks ...passed 00:29:50.106 Test: blockdev writev readv 30 x 1block ...passed 00:29:50.106 Test: blockdev writev readv block ...passed 00:29:50.106 Test: blockdev writev readv size > 128k ...passed 00:29:50.106 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:50.106 Test: blockdev comparev and writev ...passed 00:29:50.106 Test: blockdev nvme passthru rw ...passed 00:29:50.106 Test: blockdev nvme passthru vendor specific ...passed 00:29:50.106 Test: blockdev nvme admin passthru ...passed 00:29:50.106 Test: blockdev copy ...passed 00:29:50.106 00:29:50.106 Run Summary: Type Total Ran Passed Failed Inactive 00:29:50.106 suites 4 4 n/a 0 0 00:29:50.106 tests 92 92 92 0 0 00:29:50.106 asserts 520 520 520 0 n/a 00:29:50.106 00:29:50.106 Elapsed time = 0.481 seconds 00:29:50.106 0 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1175166 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 1175166 ']' 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 1175166 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1175166 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- 
common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1175166' 00:29:50.106 killing process with pid 1175166 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@968 -- # kill 1175166 00:29:50.106 10:25:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@973 -- # wait 1175166 00:29:50.367 10:25:12 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:29:50.367 00:29:50.367 real 0m3.258s 00:29:50.367 user 0m9.277s 00:29:50.367 sys 0m0.404s 00:29:50.367 10:25:12 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:50.367 10:25:12 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:29:50.367 ************************************ 00:29:50.367 END TEST bdev_bounds 00:29:50.367 ************************************ 00:29:50.627 10:25:12 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:29:50.627 10:25:12 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:29:50.627 10:25:12 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:50.627 10:25:12 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:50.627 ************************************ 00:29:50.627 START TEST bdev_nbd 00:29:50.627 ************************************ 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- 
bdev/blockdev.sh@314 -- # local nbd_list 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1175758 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1175758 /var/tmp/spdk-nbd.sock 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 1175758 ']' 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:29:50.627 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:50.627 10:25:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:50.627 [2024-06-10 10:25:12.336232] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:29:50.627 [2024-06-10 10:25:12.336276] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:50.627 [2024-06-10 10:25:12.413455] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:50.627 [2024-06-10 10:25:12.475613] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:50.887 [2024-06-10 10:25:12.496622] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:29:50.887 [2024-06-10 10:25:12.504648] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:50.887 [2024-06-10 10:25:12.512662] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:50.887 [2024-06-10 10:25:12.605027] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:29:53.432 [2024-06-10 10:25:14.756050] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:29:53.432 [2024-06-10 10:25:14.756099] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:53.432 [2024-06-10 10:25:14.756107] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:53.432 [2024-06-10 10:25:14.764068] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:29:53.432 [2024-06-10 10:25:14.764079] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:53.432 [2024-06-10 10:25:14.764085] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:53.432 [2024-06-10 10:25:14.772088] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:29:53.432 [2024-06-10 10:25:14.772099] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:53.432 [2024-06-10 10:25:14.772104] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:53.432 [2024-06-10 10:25:14.780107] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:29:53.432 [2024-06-10 10:25:14.780119] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:53.432 [2024-06-10 10:25:14.780124] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 
crypto_ram3' 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:29:53.432 10:25:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:29:53.432 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:29:53.432 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:29:53.432 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:29:53.432 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:29:53.432 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:53.432 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:53.433 1+0 records in 00:29:53.433 1+0 records out 00:29:53.433 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281926 s, 14.5 MB/s 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:29:53.433 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:29:53.721 10:25:15 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:53.721 1+0 records in 00:29:53.721 1+0 records out 00:29:53.721 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250764 s, 16.3 MB/s 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:53.721 1+0 records in 00:29:53.721 1+0 records out 00:29:53.721 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000339873 s, 12.1 MB/s 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:29:53.721 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:53.982 1+0 records in 00:29:53.982 1+0 records out 00:29:53.982 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000213461 s, 19.2 MB/s 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:29:53.982 10:25:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:54.244 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:29:54.244 { 00:29:54.244 "nbd_device": "/dev/nbd0", 00:29:54.244 "bdev_name": "crypto_ram" 00:29:54.244 }, 00:29:54.244 { 00:29:54.244 "nbd_device": "/dev/nbd1", 00:29:54.244 "bdev_name": "crypto_ram1" 00:29:54.244 }, 00:29:54.244 { 00:29:54.244 "nbd_device": "/dev/nbd2", 00:29:54.244 "bdev_name": "crypto_ram2" 00:29:54.244 }, 00:29:54.244 { 00:29:54.244 "nbd_device": "/dev/nbd3", 00:29:54.244 "bdev_name": "crypto_ram3" 00:29:54.244 } 00:29:54.244 ]' 00:29:54.244 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:29:54.244 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:29:54.244 { 00:29:54.244 "nbd_device": "/dev/nbd0", 00:29:54.244 "bdev_name": "crypto_ram" 00:29:54.244 }, 00:29:54.244 { 00:29:54.244 "nbd_device": "/dev/nbd1", 00:29:54.244 "bdev_name": "crypto_ram1" 00:29:54.244 }, 00:29:54.244 { 00:29:54.244 "nbd_device": "/dev/nbd2", 00:29:54.244 "bdev_name": "crypto_ram2" 00:29:54.244 }, 00:29:54.244 { 00:29:54.244 "nbd_device": "/dev/nbd3", 00:29:54.244 "bdev_name": "crypto_ram3" 00:29:54.244 } 00:29:54.244 ]' 00:29:54.244 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:29:54.244 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:29:54.244 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:54.244 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:29:54.244 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:54.244 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:54.244 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:54.244 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:54.505 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:54.505 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:54.505 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:54.505 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:54.505 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:54.505 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:54.505 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:54.505 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:54.505 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:54.505 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:54.767 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:54.767 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:54.767 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:54.767 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:54.767 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:54.767 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:54.767 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:54.767 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:54.767 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:54.767 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:29:54.767 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:55.028 10:25:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:55.289 10:25:17 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:55.289 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:29:55.550 /dev/nbd0 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 
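
The nbd phase around this point is nbd_common.sh exercising each crypto bdev through the kernel nbd driver: nbd_start_disk exports the bdev on a dedicated RPC socket, waitfornbd polls /proc/partitions until the device node appears, a single O_DIRECT dd confirms a block can be read back, and nbd_get_disks/nbd_stop_disk list and detach the exports. Condensed for one device, with the scratch file path shortened to /tmp/nbdtest for illustration:

  # Export the bdev as a kernel block device over the nbd-specific RPC socket.
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0

  # The harness retries this check up to 20 times before giving up.
  grep -q -w nbd0 /proc/partitions

  # Read one 4 KiB block through the page-cache bypass path, as the trace does.
  dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct

  # List the current exports, then tear the device down again.
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
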
00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:55.550 1+0 records in 00:29:55.550 1+0 records out 00:29:55.550 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273371 s, 15.0 MB/s 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:55.550 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:29:55.812 /dev/nbd1 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:55.812 1+0 records in 00:29:55.812 1+0 records out 00:29:55.812 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283184 s, 14.5 MB/s 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:55.812 
10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:55.812 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:29:56.073 /dev/nbd10 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:56.073 1+0 records in 00:29:56.073 1+0 records out 00:29:56.073 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272211 s, 15.0 MB/s 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:56.073 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:29:56.073 /dev/nbd11 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:56.334 1+0 records in 00:29:56.334 1+0 records out 00:29:56.334 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274409 s, 14.9 MB/s 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:56.334 10:25:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:56.334 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:29:56.334 { 00:29:56.334 "nbd_device": "/dev/nbd0", 00:29:56.334 "bdev_name": "crypto_ram" 00:29:56.334 }, 00:29:56.334 { 00:29:56.334 "nbd_device": "/dev/nbd1", 00:29:56.334 "bdev_name": "crypto_ram1" 00:29:56.334 }, 00:29:56.334 { 00:29:56.334 "nbd_device": "/dev/nbd10", 00:29:56.334 "bdev_name": "crypto_ram2" 00:29:56.334 }, 00:29:56.334 { 00:29:56.334 "nbd_device": "/dev/nbd11", 00:29:56.334 "bdev_name": "crypto_ram3" 00:29:56.334 } 00:29:56.334 ]' 00:29:56.334 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:29:56.334 { 00:29:56.334 "nbd_device": "/dev/nbd0", 00:29:56.334 "bdev_name": "crypto_ram" 00:29:56.334 }, 00:29:56.334 { 00:29:56.334 "nbd_device": "/dev/nbd1", 00:29:56.334 "bdev_name": "crypto_ram1" 00:29:56.334 }, 00:29:56.334 { 00:29:56.334 "nbd_device": "/dev/nbd10", 00:29:56.334 "bdev_name": "crypto_ram2" 00:29:56.334 }, 00:29:56.334 { 00:29:56.334 "nbd_device": "/dev/nbd11", 00:29:56.334 "bdev_name": "crypto_ram3" 00:29:56.334 } 00:29:56.334 ]' 00:29:56.334 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] 
| .nbd_device' 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:29:56.595 /dev/nbd1 00:29:56.595 /dev/nbd10 00:29:56.595 /dev/nbd11' 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:29:56.595 /dev/nbd1 00:29:56.595 /dev/nbd10 00:29:56.595 /dev/nbd11' 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:29:56.595 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:29:56.595 256+0 records in 00:29:56.595 256+0 records out 00:29:56.595 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0118757 s, 88.3 MB/s 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:29:56.596 256+0 records in 00:29:56.596 256+0 records out 00:29:56.596 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0579052 s, 18.1 MB/s 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:29:56.596 256+0 records in 00:29:56.596 256+0 records out 00:29:56.596 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0466484 s, 22.5 MB/s 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:29:56.596 256+0 records in 00:29:56.596 256+0 records out 00:29:56.596 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0421305 s, 24.9 MB/s 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 
oflag=direct 00:29:56.596 256+0 records in 00:29:56.596 256+0 records out 00:29:56.596 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0294681 s, 35.6 MB/s 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:56.596 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:56.856 10:25:18 
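The nbd_dd_data_verify steps just traced follow a simple round trip: seed a 1 MiB random file, dd it onto every exported nbd device with direct I/O, then byte-compare each device against that same file before tearing the devices down. A hedged sketch of that pattern with illustrative paths (the trace keeps its scratch file at test/bdev/nbdrandtest under the workspace):

    # Sketch only: write-then-compare data verification across a set of nbd
    # devices, mirroring the dd/cmp sequence in the trace above.
    nbd_data_roundtrip() {
        local tmp_file=$1; shift              # e.g. ./nbdrandtest (illustrative)
        local nbd_list=("$@")                 # e.g. /dev/nbd0 /dev/nbd1 ...
        local dev
        dd if=/dev/urandom of="$tmp_file" bs=4096 count=256          # 1 MiB seed
        for dev in "${nbd_list[@]}"; do
            dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
        done
        for dev in "${nbd_list[@]}"; do
            # -b reports differing bytes; -n 1M limits the compare to the seed size
            cmp -b -n 1M "$tmp_file" "$dev" || return 1
        done
        rm -f "$tmp_file"
    }

After a clean compare, the devices are detached with rpc.py nbd_stop_disk and waitfornbd_exit polls /proc/partitions until each node disappears, which is what the following entries show.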
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:56.856 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:57.117 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:57.117 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:57.117 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:57.117 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:57.117 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:57.117 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:57.117 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:57.117 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:57.117 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:57.117 10:25:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:29:57.376 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:29:57.376 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:29:57.376 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:29:57.376 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:57.376 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:57.376 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:29:57.376 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:57.376 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:57.376 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:57.376 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:29:57.637 
10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:57.637 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:29:57.897 malloc_lvol_verify 00:29:57.897 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:29:58.158 d1d3f4cb-ccb4-4bbd-8eb5-6537a92bef1f 00:29:58.158 10:25:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:29:58.468 3607dea8-3137-47d1-ab15-0f93304a8651 00:29:58.468 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:29:58.468 /dev/nbd0 00:29:58.468 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:29:58.468 mke2fs 1.46.5 (30-Dec-2021) 00:29:58.468 Discarding device blocks: 0/4096 done 00:29:58.468 
Creating filesystem with 4096 1k blocks and 1024 inodes 00:29:58.468 00:29:58.468 Allocating group tables: 0/1 done 00:29:58.468 Writing inode tables: 0/1 done 00:29:58.468 Creating journal (1024 blocks): done 00:29:58.468 Writing superblocks and filesystem accounting information: 0/1 done 00:29:58.468 00:29:58.468 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:29:58.468 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:29:58.468 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:58.468 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:58.468 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:58.468 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:58.468 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:58.468 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:58.750 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:58.750 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:58.750 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:58.750 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:58.750 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:58.750 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:58.750 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:58.750 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:58.750 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1175758 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 1175758 ']' 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 1175758 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1175758 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1175758' 00:29:58.751 killing process with pid 1175758 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@968 -- # kill 1175758 00:29:58.751 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@973 -- # wait 1175758 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT 
SIGTERM EXIT 00:29:59.011 00:29:59.011 real 0m8.476s 00:29:59.011 user 0m11.638s 00:29:59.011 sys 0m2.361s 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:59.011 ************************************ 00:29:59.011 END TEST bdev_nbd 00:29:59.011 ************************************ 00:29:59.011 10:25:20 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:29:59.011 10:25:20 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:29:59.011 10:25:20 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:29:59.011 10:25:20 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:29:59.011 10:25:20 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:29:59.011 10:25:20 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:59.011 10:25:20 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:59.011 ************************************ 00:29:59.011 START TEST bdev_fio 00:29:59.011 ************************************ 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:59.011 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:59.011 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:29:59.012 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:29:59.012 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:59.012 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1300 -- 
# cat 00:29:59.012 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:29:59.012 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:29:59.012 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:29:59.012 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:59.273 ************************************ 00:29:59.273 START TEST bdev_fio_rw_verify 00:29:59.273 ************************************ 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 
--aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:29:59.273 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:29:59.274 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:59.274 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:29:59.274 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:29:59.274 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:29:59.274 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:29:59.274 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:59.274 10:25:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:59.535 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:59.535 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:59.535 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:59.535 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:59.535 fio-3.35 00:29:59.535 Starting 4 threads 00:30:14.485 00:30:14.485 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1178000: Mon Jun 10 10:25:33 2024 00:30:14.485 read: IOPS=36.4k, BW=142MiB/s (149MB/s)(1422MiB/10001msec) 00:30:14.485 slat (usec): min=14, max=515, avg=35.75, stdev=21.11 00:30:14.485 clat (usec): min=18, max=1213, avg=215.57, stdev=142.04 00:30:14.485 lat (usec): min=35, max=1388, avg=251.32, stdev=152.87 00:30:14.485 clat percentiles (usec): 00:30:14.485 | 50.000th=[ 169], 99.000th=[ 709], 99.900th=[ 865], 99.990th=[ 947], 00:30:14.485 | 99.999th=[ 1106] 00:30:14.485 write: IOPS=39.9k, BW=156MiB/s (164MB/s)(1520MiB/9745msec); 0 zone resets 00:30:14.485 slat (usec): min=15, max=564, avg=45.16, stdev=21.00 00:30:14.485 clat (usec): min=16, max=2047, avg=245.30, stdev=145.17 00:30:14.485 lat (usec): min=41, max=2258, avg=290.46, stdev=156.11 00:30:14.485 clat percentiles (usec): 00:30:14.485 | 50.000th=[ 212], 99.000th=[ 725], 99.900th=[ 898], 99.990th=[ 1303], 00:30:14.485 | 99.999th=[ 1893] 00:30:14.485 bw ( KiB/s): min=128800, max=181848, per=98.00%, avg=156494.32, stdev=4635.19, samples=76 00:30:14.485 iops : min=32200, max=45462, avg=39123.47, stdev=1158.83, samples=76 00:30:14.485 lat (usec) : 20=0.01%, 50=0.07%, 100=15.15%, 250=49.44%, 500=29.58% 00:30:14.485 lat (usec) : 750=5.07%, 1000=0.68% 00:30:14.485 lat (msec) : 2=0.02%, 4=0.01% 00:30:14.485 cpu : usr=99.75%, sys=0.00%, ctx=56, majf=0, minf=235 00:30:14.485 IO depths : 1=0.1%, 2=28.6%, 4=57.1%, 8=14.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:14.485 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:14.485 complete : 0=0.0%, 4=87.5%, 8=12.5%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:14.485 issued rwts: total=364116,389021,0,0 short=0,0,0,0 dropped=0,0,0,0 00:30:14.485 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:14.485 00:30:14.485 Run status group 0 (all jobs): 00:30:14.485 READ: bw=142MiB/s (149MB/s), 142MiB/s-142MiB/s (149MB/s-149MB/s), io=1422MiB (1491MB), run=10001-10001msec 00:30:14.485 WRITE: bw=156MiB/s (164MB/s), 156MiB/s-156MiB/s (164MB/s-164MB/s), io=1520MiB (1593MB), run=9745-9745msec 00:30:14.485 00:30:14.485 real 0m13.256s 00:30:14.485 user 0m48.361s 00:30:14.485 sys 0m0.371s 00:30:14.485 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:14.485 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:30:14.485 ************************************ 00:30:14.485 END TEST bdev_fio_rw_verify 00:30:14.485 ************************************ 00:30:14.485 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:30:14.485 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:14.485 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:30:14.485 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:14.485 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:30:14.485 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:30:14.485 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:30:14.485 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:30:14.485 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0026a223-3ea6-573a-9fe4-b432517699d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0026a223-3ea6-573a-9fe4-b432517699d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "0601ff36-398d-5d5a-a510-7802b96ea5a1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0601ff36-398d-5d5a-a510-7802b96ea5a1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4d8a250d-1085-5ec5-b1b5-d9c695c8f0d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4d8a250d-1085-5ec5-b1b5-d9c695c8f0d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f73e9eb5-7de2-5655-bfa9-9bd82616aac1"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f73e9eb5-7de2-5655-bfa9-9bd82616aac1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:30:14.486 crypto_ram1 00:30:14.486 crypto_ram2 00:30:14.486 crypto_ram3 ]] 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0026a223-3ea6-573a-9fe4-b432517699d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0026a223-3ea6-573a-9fe4-b432517699d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "0601ff36-398d-5d5a-a510-7802b96ea5a1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0601ff36-398d-5d5a-a510-7802b96ea5a1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4d8a250d-1085-5ec5-b1b5-d9c695c8f0d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4d8a250d-1085-5ec5-b1b5-d9c695c8f0d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f73e9eb5-7de2-5655-bfa9-9bd82616aac1"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f73e9eb5-7de2-5655-bfa9-9bd82616aac1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:14.486 ************************************ 00:30:14.486 START TEST bdev_fio_trim 00:30:14.486 ************************************ 00:30:14.486 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:14.487 10:25:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:14.487 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:14.487 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:14.487 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:14.487 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:14.487 fio-3.35 00:30:14.487 Starting 4 threads 00:30:26.728 00:30:26.728 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1180530: Mon Jun 10 10:25:47 2024 00:30:26.728 write: IOPS=73.7k, BW=288MiB/s (302MB/s)(2880MiB/10001msec); 0 zone resets 00:30:26.728 slat (usec): min=14, max=567, avg=33.51, stdev=18.84 00:30:26.728 clat (usec): min=16, max=977, avg=114.37, stdev=62.03 00:30:26.728 lat (usec): min=31, max=1192, avg=147.89, stdev=71.47 00:30:26.728 clat percentiles (usec): 00:30:26.728 | 50.000th=[ 103], 99.000th=[ 306], 99.900th=[ 404], 99.990th=[ 523], 00:30:26.728 | 99.999th=[ 922] 00:30:26.728 bw ( KiB/s): min=239520, max=307776, per=100.00%, avg=297985.68, stdev=4816.14, samples=76 00:30:26.728 iops : min=59880, max=76944, avg=74496.42, stdev=1204.04, samples=76 00:30:26.728 trim: IOPS=73.7k, BW=288MiB/s (302MB/s)(2880MiB/10001msec); 0 zone resets 00:30:26.728 slat (nsec): min=3805, max=45030, avg=6890.06, stdev=2984.26 00:30:26.728 clat (usec): min=20, max=1193, avg=148.05, stdev=71.48 00:30:26.728 lat (usec): min=26, max=1218, avg=154.94, stdev=71.83 00:30:26.728 clat percentiles (usec): 00:30:26.728 | 50.000th=[ 133], 99.000th=[ 367], 99.900th=[ 482], 99.990th=[ 644], 00:30:26.728 | 99.999th=[ 1123] 00:30:26.728 bw ( KiB/s): min=239520, max=307776, per=100.00%, avg=297985.68, stdev=4816.14, samples=76 00:30:26.728 iops : min=59880, max=76944, avg=74496.42, stdev=1204.04, samples=76 00:30:26.728 lat (usec) : 20=0.01%, 50=6.65%, 100=31.17%, 250=55.38%, 500=6.76% 00:30:26.728 lat (usec) : 750=0.04%, 1000=0.01% 00:30:26.728 lat (msec) : 2=0.01% 00:30:26.728 cpu : usr=99.77%, sys=0.00%, ctx=53, majf=0, minf=84 00:30:26.728 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:26.728 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:26.728 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:26.728 issued rwts: total=0,737253,737254,0 short=0,0,0,0 dropped=0,0,0,0 00:30:26.728 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:26.728 00:30:26.728 Run status group 0 (all jobs): 00:30:26.728 WRITE: bw=288MiB/s (302MB/s), 288MiB/s-288MiB/s (302MB/s-302MB/s), io=2880MiB (3020MB), run=10001-10001msec 00:30:26.728 TRIM: bw=288MiB/s (302MB/s), 288MiB/s-288MiB/s (302MB/s-302MB/s), io=2880MiB (3020MB), run=10001-10001msec 00:30:26.728 00:30:26.728 real 0m13.378s 00:30:26.728 user 0m50.673s 00:30:26.728 sys 0m0.411s 00:30:26.728 10:25:47 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:26.728 10:25:47 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:30:26.728 ************************************ 00:30:26.728 END TEST bdev_fio_trim 00:30:26.728 ************************************ 00:30:26.728 10:25:47 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:30:26.728 10:25:47 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:26.728 10:25:47 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:30:26.728 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:26.728 10:25:47 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:30:26.728 00:30:26.728 real 0m26.966s 00:30:26.728 user 
1m39.223s 00:30:26.728 sys 0m0.941s 00:30:26.728 10:25:47 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:26.728 10:25:47 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:26.728 ************************************ 00:30:26.728 END TEST bdev_fio 00:30:26.728 ************************************ 00:30:26.728 10:25:47 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:26.728 10:25:47 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:30:26.728 10:25:47 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:30:26.728 10:25:47 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:26.728 10:25:47 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:26.728 ************************************ 00:30:26.728 START TEST bdev_verify 00:30:26.728 ************************************ 00:30:26.728 10:25:47 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:30:26.728 [2024-06-10 10:25:47.936043] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:30:26.728 [2024-06-10 10:25:47.936100] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1182117 ] 00:30:26.728 [2024-06-10 10:25:48.025919] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:26.728 [2024-06-10 10:25:48.109909] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:30:26.728 [2024-06-10 10:25:48.109952] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:26.728 [2024-06-10 10:25:48.131029] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:26.728 [2024-06-10 10:25:48.139056] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:26.728 [2024-06-10 10:25:48.147074] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:26.728 [2024-06-10 10:25:48.240435] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:28.641 [2024-06-10 10:25:50.398425] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:28.641 [2024-06-10 10:25:50.398488] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:28.641 [2024-06-10 10:25:50.398496] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:28.641 [2024-06-10 10:25:50.406442] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:28.641 [2024-06-10 10:25:50.406453] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:28.641 [2024-06-10 10:25:50.406459] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:28.641 
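The crypto vbdevs exercised by these tests (crypto_ram .. crypto_ram3) are layered on malloc bdevs through the JSON configuration handed to fio and bdevperf via --spdk_json_conf / --json. A minimal sketch of one such stacking, using the bdev and key names visible in this log — the file path is a placeholder, the DEK is assumed to have been registered with the accel layer beforehand (e.g. via accel_crypto_key_create), and exact RPC parameter names can differ between SPDK versions:

# hypothetical path; the harness generates its own test/bdev/bdev.json
cat > /tmp/crypto_bdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 } },
        { "method": "bdev_crypto_create",
          "params": { "base_bdev_name": "Malloc0", "name": "crypto_ram", "key_name": "test_dek_qat_cbc" } }
      ]
    }
  ]
}
EOF

The "vbdev creation deferred pending base bdev arrival" notices above simply mean the crypto create call was processed before its Malloc base bdev existed; the vbdev is completed once the base bdev registers.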
[2024-06-10 10:25:50.414462] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:28.641 [2024-06-10 10:25:50.414472] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:28.641 [2024-06-10 10:25:50.414477] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:28.641 [2024-06-10 10:25:50.422483] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:28.641 [2024-06-10 10:25:50.422493] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:28.641 [2024-06-10 10:25:50.422498] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:28.641 Running I/O for 5 seconds... 00:30:33.927 00:30:33.927 Latency(us) 00:30:33.927 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:33.927 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:33.927 Verification LBA range: start 0x0 length 0x1000 00:30:33.927 crypto_ram : 5.05 615.82 2.41 0.00 0.00 206872.94 2344.17 139541.27 00:30:33.927 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:33.927 Verification LBA range: start 0x1000 length 0x1000 00:30:33.928 crypto_ram : 5.05 617.82 2.41 0.00 0.00 206297.62 2457.60 139541.27 00:30:33.928 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:33.928 Verification LBA range: start 0x0 length 0x1000 00:30:33.928 crypto_ram1 : 5.06 620.35 2.42 0.00 0.00 205094.44 2041.70 128248.91 00:30:33.928 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:33.928 Verification LBA range: start 0x1000 length 0x1000 00:30:33.928 crypto_ram1 : 5.05 622.33 2.43 0.00 0.00 204474.99 2445.00 127442.31 00:30:33.928 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:33.928 Verification LBA range: start 0x0 length 0x1000 00:30:33.928 crypto_ram2 : 5.04 4852.44 18.95 0.00 0.00 26159.88 4965.61 22887.19 00:30:33.928 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:33.928 Verification LBA range: start 0x1000 length 0x1000 00:30:33.928 crypto_ram2 : 5.03 4856.55 18.97 0.00 0.00 26141.39 5999.06 23290.49 00:30:33.928 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:33.928 Verification LBA range: start 0x0 length 0x1000 00:30:33.928 crypto_ram3 : 5.05 4869.99 19.02 0.00 0.00 26010.54 1380.04 23391.31 00:30:33.928 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:33.928 Verification LBA range: start 0x1000 length 0x1000 00:30:33.928 crypto_ram3 : 5.04 4874.25 19.04 0.00 0.00 25985.82 1436.75 22685.54 00:30:33.928 =================================================================================================================== 00:30:33.928 Total : 21929.56 85.66 0.00 0.00 46400.20 1380.04 139541.27 00:30:34.189 00:30:34.189 real 0m7.944s 00:30:34.189 user 0m15.260s 00:30:34.189 sys 0m0.252s 00:30:34.189 10:25:55 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:34.189 10:25:55 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:30:34.189 ************************************ 00:30:34.189 END TEST bdev_verify 00:30:34.189 ************************************ 00:30:34.189 10:25:55 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test 
bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:30:34.189 10:25:55 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:30:34.189 10:25:55 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:34.189 10:25:55 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:34.189 ************************************ 00:30:34.189 START TEST bdev_verify_big_io 00:30:34.189 ************************************ 00:30:34.189 10:25:55 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:30:34.189 [2024-06-10 10:25:55.949294] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:30:34.189 [2024-06-10 10:25:55.949340] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1183522 ] 00:30:34.189 [2024-06-10 10:25:56.016068] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:34.450 [2024-06-10 10:25:56.079672] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:30:34.450 [2024-06-10 10:25:56.079677] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:34.450 [2024-06-10 10:25:56.100728] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:34.450 [2024-06-10 10:25:56.108754] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:34.450 [2024-06-10 10:25:56.116773] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:34.450 [2024-06-10 10:25:56.202207] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:36.996 [2024-06-10 10:25:58.352406] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:36.996 [2024-06-10 10:25:58.352468] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:36.996 [2024-06-10 10:25:58.352480] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:36.996 [2024-06-10 10:25:58.360424] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:36.996 [2024-06-10 10:25:58.360436] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:36.996 [2024-06-10 10:25:58.360442] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:36.996 [2024-06-10 10:25:58.368446] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:36.996 [2024-06-10 10:25:58.368458] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:36.996 [2024-06-10 10:25:58.368464] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:36.996 [2024-06-10 10:25:58.376466] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:36.996 
[2024-06-10 10:25:58.376477] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:36.996 [2024-06-10 10:25:58.376482] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:36.996 Running I/O for 5 seconds... 00:30:37.571 [2024-06-10 10:25:59.178517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.178853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.178903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.178938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.178970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.179000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.179306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.179315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.182427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.182461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.182491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.182520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.182846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.182879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.182909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.182940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.183255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.183264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.187228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.187260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.187289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.187323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
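Stripped of the run_test / xtrace harness, the bdev_verify_big_io case driving the output above is a single bdevperf invocation against that same JSON config; from an spdk checkout it is roughly:

# paths shortened relative to the spdk checkout; flag values copied from the command echoed in this log
./build/examples/bdevperf --json ./test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3

Here -q is the per-job queue depth, -o the I/O size in bytes (64 KiB, versus 4 KiB in the plain bdev_verify case), -w the workload, -t the run time in seconds and -m the reactor core mask; -C is passed through exactly as the harness does.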
00:30:37.571 [2024-06-10 10:25:59.187659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.187691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.187722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.187751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.188196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.188205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.192315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.192349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.192379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.192408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.192796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.192830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.192860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.192889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.193191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.193200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.195936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.195967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.195997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.196029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.196495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.196526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.196575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.196606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.571 [2024-06-10 10:25:59.196939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.196948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.199811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.199846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.199878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.199907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.200319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.200350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.200380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.200409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.200714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.200723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.203405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.203436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.203465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.203514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.203918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.203949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.203978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.204008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.204302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.204311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.206932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.206963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.571 [2024-06-10 10:25:59.206992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.207021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.571 [2024-06-10 10:25:59.207335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.207367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.207397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.207426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.207728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.207736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.211033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.211064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.211093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.211126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.211484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.211515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.211544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.211572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.211882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.211891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.214743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.214775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.214804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.214836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.215173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.215207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.572 [2024-06-10 10:25:59.215238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.215267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.215692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.215701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.218276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.218307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.218336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.218368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.218681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.218714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.218744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.218773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.219079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.219087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.221965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.221998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.222046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.222075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.222500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.222535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.222566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.222595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.222925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.222934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.572 [2024-06-10 10:25:59.225244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.225276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.225306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.225336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.225676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.225708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.225738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.225768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.226077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.226085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.228373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.228407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.228437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.228466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.228820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.228855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.228884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.228913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.229272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.229281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.232124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.232156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.232185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.232215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.572 [2024-06-10 10:25:59.232603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.232637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.232666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.232705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.233011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.233019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.236229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.236261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.236290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.236321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.236652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.236684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.236713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.236745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.237058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.237069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.239895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.239928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.239976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.240009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.240517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.240549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.240593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.572 [2024-06-10 10:25:59.240623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.573 [2024-06-10 10:25:59.240952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.240961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.243695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.243726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.243756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.243787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.244133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.244164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.244218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.244278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.244710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.244719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.247017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.247049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.247079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.247115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.247482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.247513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.247542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.247571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.247933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.247943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.250005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.250037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.573 [2024-06-10 10:25:59.250066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.250114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.250542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.250573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.250602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.250631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.250941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.250950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.253465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.253497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.253526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.253557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.253888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.253919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.253952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.253981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.254257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.254266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.257113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.257145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.257174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.257203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.257538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.257570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.573 [2024-06-10 10:25:59.257598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.257628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.258127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.258136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.260714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.260746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.260775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.260806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.261105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.261136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.261165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.261193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.261501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.261510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.263463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.263495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.263524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.263566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.263921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.263952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.263981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.264015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.264316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.264324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.573 [2024-06-10 10:25:59.266855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.266887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.266916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.266946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.267272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.267303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.267332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.267362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.267625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.267634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.269241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.269273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.269302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.269331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.269606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.269636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.269665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.269695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.269915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.573 [2024-06-10 10:25:59.269924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.272107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.272140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.272169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.272197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.574 [2024-06-10 10:25:59.272542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.272572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.272602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.272631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.272864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.272872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.274445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.274477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.274506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.274535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.274782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.274811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.274845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.274875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.275092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.275100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.277194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.277227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.277258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.277287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.277615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.277647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.277678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.277707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.574 [2024-06-10 10:25:59.277927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.277936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.279451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.279486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.279515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.279545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.279787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.279818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.279852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.279881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.280102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.280111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.282231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.282263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.282292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.282322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.282655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.282686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.282715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.282745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.282966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.282975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.284427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.284459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.574 [2024-06-10 10:25:59.284488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.284513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.284754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.284784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.284814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.284865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.285086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.285094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.287109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.288034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.289325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.290900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.291772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.293052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.294630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.296213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.296507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.296519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.300522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.302097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.303670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.304512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.306026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.307609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.574 [2024-06-10 10:25:59.309031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.309329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.309625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.309633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.313151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.314721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.315353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.316637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.318384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.319537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.319840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.320138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.320439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.320448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.323537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.574 [2024-06-10 10:25:59.324158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.325447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.327002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.328403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.328702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.329002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.329300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.329594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.329606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.575 [2024-06-10 10:25:59.331856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.333205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.334790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.336361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.336928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.337229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.337525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.337825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.338120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.338129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.340960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.342544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.344111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.344931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.345612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.345915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.346214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.346957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.347250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.347258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.350121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.351699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.352763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.353074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.575 [2024-06-10 10:25:59.353682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.353986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.354571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.355857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.356078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.356086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.358960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.360265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.360567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.360870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.361454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.361788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.363075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.364658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.364885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.364893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.367714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.368018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.368317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.368614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.369225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.370541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.372103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.373668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.575 [2024-06-10 10:25:59.373932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.373941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.375568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.375874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.376174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.376474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.378201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.379737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.381281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.382315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.382539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.382547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.384371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.384672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.575 [2024-06-10 10:25:59.384974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.385289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.386804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.388265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.389595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.390875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.391105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.391114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.393147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.393446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.576 [2024-06-10 10:25:59.393747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.395386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.397141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.398699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.399354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.400613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.400834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.400843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.403190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.403492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.404643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.405931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.407731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.408354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.409625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.411173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.411395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.411404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.413539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.414746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.416035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.417566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.418388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.419669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.576 [2024-06-10 10:25:59.421213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.422777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.423032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.423042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.426889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.428437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.429989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.431002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.576 [2024-06-10 10:25:59.432490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.434046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.435625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.435926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.436308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.436317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.439730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.441288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.441987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.443337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.445066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.446443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.446742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.447044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.447356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.447365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.841 [2024-06-10 10:25:59.450455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.451099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.452382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.453950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.455492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.455792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.456092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.456389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.456707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.456715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.458976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.460474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.462015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.463584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.464111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.464413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.464712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.465012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.465303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.465311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.467989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.469416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.470989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.841 [2024-06-10 10:25:59.472200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.842 [2024-06-10 10:25:59.472957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.473257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.473554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.474345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.474601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.474610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.477245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.478814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.480218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.480518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.481122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.481421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.482026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.483312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.483536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.483544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.486480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.487995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.488294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.488592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.489180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.489562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.490840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.492399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.842 [2024-06-10 10:25:59.492621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.492630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.495576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.495898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.496196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.496495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.497122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.498561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.500095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.501667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.501917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.501927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.503839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.504140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.504437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.504738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.506498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.508007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.509568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.510602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.510829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.510838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.512474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.512773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.842 [2024-06-10 10:25:59.513078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.513376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.514890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.516392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.517752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.519021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.519279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.519288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.521078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.521400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.521699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.523169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.524958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.526419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.527599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.528879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.529101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.529109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.531675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.531979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.533279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.534765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.535412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.536666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.842 [2024-06-10 10:25:59.538116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.539267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.539715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.539723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.542153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.542453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.542751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.543052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.543651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.543954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.544253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.544551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.544845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.544854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.546907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.547207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.547508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.547806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.548468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.548770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.842 [2024-06-10 10:25:59.549072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.549373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.549680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.549690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.843 [2024-06-10 10:25:59.551899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.552200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.552498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.552794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.553467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.553767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.554068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.554365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.554681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.554689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.557329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.557630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.557933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.558231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.558806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.559109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.559408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.559705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.560101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.560110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.562150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.562453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.562751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.562775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.843 [2024-06-10 10:25:59.563466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.563777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.564103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.564401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.564717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.564725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.567680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.567984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.568302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.568598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.568634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.568932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.569235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.569533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.569834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.570135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.570465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.570473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.572171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.572203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.572250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.572279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.572691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.572725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.843 [2024-06-10 10:25:59.572754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.572783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.572814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.573098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.573107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.574923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.574954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.574984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.575012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.575394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.575428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.575457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.575486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.575516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.575832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.575841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.577630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.577664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.577694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.577725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.578044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.578079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.578109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:37.843 [2024-06-10 10:25:59.578138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:37.843 [2024-06-10 10:25:59.578167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:37.843 [... identical *ERROR* line from accel_dpdk_cryptodev.c:468 repeated several hundred times, timestamps 10:25:59.578 through 10:25:59.826 ...]
00:30:38.113 [2024-06-10 10:25:59.826494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:38.113 [2024-06-10 10:25:59.826793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.827093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.827390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.827705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.827714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.830061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.831508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.832976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.834542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.834768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.835072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.835370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.835667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.835967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.836234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.836244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.838668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.839956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.841456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.842841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.843174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.843476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.843775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.844075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.113 [2024-06-10 10:25:59.844969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.845216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.845224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.847783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.849342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.850861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.851160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.851536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.851839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.852138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.852715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.854001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.854221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.854230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.857054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.858629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.859015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.859314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.859606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.859928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.860226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.861631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.863196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.863423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.113 [2024-06-10 10:25:59.863432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.866295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.867030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.867334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.867632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.868028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.868329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.869780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.871237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.872796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.873020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.873028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.875341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.875641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.875941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.876238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.113 [2024-06-10 10:25:59.876538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.877386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.878993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.880580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.880879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.881097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.881109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.882954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.114 [2024-06-10 10:25:59.883276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.883575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.883886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.884202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.884503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.884802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.885102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.885403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.885661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.885670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.887683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.887986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.888283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.888596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.888919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.889222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.889521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.889818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.890119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.890521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.890529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.892653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.892955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.893253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.114 [2024-06-10 10:25:59.893551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.893867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.894169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.894469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.894770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.895069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.895359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.895367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.897503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.897803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.898147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.898482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.898763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.899067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.899366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.899663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.899964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.900273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.900282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.902382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.902684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.902984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.903281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.903641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.114 [2024-06-10 10:25:59.903944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.904245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.904542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.904849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.905202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.905210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.907513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.907824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.908127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.908424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.908728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.909053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.909356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.909653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.909952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.910393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.910403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.912437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.912737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.913037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.913334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.913640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.913946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.914244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.114 [2024-06-10 10:25:59.914543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.914844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.915191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.915200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.917059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.917358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.917388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.917685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.918041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.918344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.918642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.918941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.919241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.919536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.114 [2024-06-10 10:25:59.919544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.921588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.921890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.922190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.922238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.922584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.922906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.923206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.923503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.923805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.115 [2024-06-10 10:25:59.924117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.924126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.926704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.926735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.926764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.926792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.927110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.927144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.927174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.927203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.927233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.927543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.927551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.929260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.929291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.929320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.929348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.929650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.929683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.929714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.929743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.929772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.930091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.930099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.115 [2024-06-10 10:25:59.931860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.931891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.931921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.931949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.932257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.932291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.932321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.932351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.932380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.932730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.932738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.934602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.934633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.934662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.934691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.934988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.935022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.935060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.935100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.935129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.935459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.935467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.937395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.937426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.115 [2024-06-10 10:25:59.937455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.937483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.937738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.937774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.937803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.937836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.937865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.938143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.938154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.939797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.939832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.939861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.939904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.940239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.940271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.940301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.940330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.940358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.940624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.940633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.943012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.943043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.943089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.943119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.115 [2024-06-10 10:25:59.943557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.943590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.943621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.943667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.943697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.944039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.944048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.946462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.946494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.946525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.946554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.946884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.115 [2024-06-10 10:25:59.946920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.946952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.946981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.947010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.947277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.947285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.949131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.949162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.949191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.949220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.949555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.949591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.116 [2024-06-10 10:25:59.949621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.949669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.949699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.950073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.950082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.951785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.951817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.951849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.951900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.952249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.952283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.952311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.952340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.952369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.952686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.952694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.954491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.954523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.954554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.954582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.954977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.955010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.955039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.955068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.116 [2024-06-10 10:25:59.955096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.955344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.955353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.957386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.957419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.957452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.957481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.957790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.957826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.957857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.957886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.957916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.958239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.958247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.960428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.960459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.960488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.960516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.960785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.960827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.960857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.960886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.960914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.116 [2024-06-10 10:25:59.961131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.116 [2024-06-10 10:25:59.961139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:38.384 [2024-06-10 10:26:00.240975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:38.384 [the same *ERROR* line repeats continuously between the two timestamps above (several hundred occurrences, identical except for the timestamps); the duplicated entries are condensed here]
00:30:38.384 [2024-06-10 10:26:00.241274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.384 [2024-06-10 10:26:00.241616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.384 [2024-06-10 10:26:00.241625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.243724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.244034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.244336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.244635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.244966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.245269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.245566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.245866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.246168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.246497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.246506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.248615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.248921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.249222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.249519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.249828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.250131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.250428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.250726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.251025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.251304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.648 [2024-06-10 10:26:00.251313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.253654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.253955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.254253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.254551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.254846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.255149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.255446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.255743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.256043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.256362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.256371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.258608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.258910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.258940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.259236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.259504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.259806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.260107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.260405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.260703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.260988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.260996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.263038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.648 [2024-06-10 10:26:00.263340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.263648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.263683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.264024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.264326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.264624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.264925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.265227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.265494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.265503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.267482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.267514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.267545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.267574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.267905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.267942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.267973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.268003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.268032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.268303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.268311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.270488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.270520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.270549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.648 [2024-06-10 10:26:00.270578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.270920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.270955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.270985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.271014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.271043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.271333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.271344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.273190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.273221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.273251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.273280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.273539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.273572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.273618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.273648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.273678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.274193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.274202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.648 [2024-06-10 10:26:00.277315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.277347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.277376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.277421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.277750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.649 [2024-06-10 10:26:00.277783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.277813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.277846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.277892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.278333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.278341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.280769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.280809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.280841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.280870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.281161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.281197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.281227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.281256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.281288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.281669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.281677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.283523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.283554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.283582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.283611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.283860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.283893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.283923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.649 [2024-06-10 10:26:00.283952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.283982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.284205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.284214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.286005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.286037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.286066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.286095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.286378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.286412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.286446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.286482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.286512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.286892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.286900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.288775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.288808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.288843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.288873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.289128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.289164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.289194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.289223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.289252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.649 [2024-06-10 10:26:00.289496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.289503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.290905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.290937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.290967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.290996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.291216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.291257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.291287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.291316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.291345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.291563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.291573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.293378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.293409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.293438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.293468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.293788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.293825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.293858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.293887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.293916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.294136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.294144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.649 [2024-06-10 10:26:00.295481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.295512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.295541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.295572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.295861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.295894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.295924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.295953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.295982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.296199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.296207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.297955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.297986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.298018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.298065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.298402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.298436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.298465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.298494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.298523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.298856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.298865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.300162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.300193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.649 [2024-06-10 10:26:00.300222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.300252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.300510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.300547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.300576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.300605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.300634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.300893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.300901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.302433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.302465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.302495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.302524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.302806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.302842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.302873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.302902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.302931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.303298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.303307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.304663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.304694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.304723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.304752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.649 [2024-06-10 10:26:00.305003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.305036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.305066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.305095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.305125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.305342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.305350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.306644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.306675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.306706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.306734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.307045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.307080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.307110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.307139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.307169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.307515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.307524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.649 [2024-06-10 10:26:00.308979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.309010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.309038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.309067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.309283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.309319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.650 [2024-06-10 10:26:00.309349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.309378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.309407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.309683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.309691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.310946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.310977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.311006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.311035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.311444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.311478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.311507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.311535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.311565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.311871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.311880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.313306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.313337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.313366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.313396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.313612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.313645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.313677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.313707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.650 [2024-06-10 10:26:00.313737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.313957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.313965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.315365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.315396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.315428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.315457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.315723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.315756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.315786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.315815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.315855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.316225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.316234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.317933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.317964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.318002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.318031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.318249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.318285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.318315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.318344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.318373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.318592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.650 [2024-06-10 10:26:00.318602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.319974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.320006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.320035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.320063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.320287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.320320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.320349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.320378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.320407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.320897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.320905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.323638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.323669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.323698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.323726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.323983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.324017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.324046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.324075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.324104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.324321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.324330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.325609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.650 [2024-06-10 10:26:00.325641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.325672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.325701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.325924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.325958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.325987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.326017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.326047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.326266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.326283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.328092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.328124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.328156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.328185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.328414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.328450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.328480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.328508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.328537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.328829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.328838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.330190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.330221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.650 [2024-06-10 10:26:00.330251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.918 [2024-06-10 10:26:00.617065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.918 [2024-06-10 10:26:00.617286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.918 [2024-06-10 10:26:00.617294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.918 [2024-06-10 10:26:00.618748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.918 [2024-06-10 10:26:00.618779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.918 [2024-06-10 10:26:00.618808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.618840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.619138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.619171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.619241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.619271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.619301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.619516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.619524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.621017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.621048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.621077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.621105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.621415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.621450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.621480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.621512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.621541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.621789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.919 [2024-06-10 10:26:00.621797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.623411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.623442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.623472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.623500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.623818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.623855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.623886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.623915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.623944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.624245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.624253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.626574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.626607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.626636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.626665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.626945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.626979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.627008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.627037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.627066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.627336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.627344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.628739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.919 [2024-06-10 10:26:00.628770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.628802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.628832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.629053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.629086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.629115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.629144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.629173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.629484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.629492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.630779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.630810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.630841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.630871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.631296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.631331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.631360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.631389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.631418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.631728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.631738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.633173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.633204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.633233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.919 [2024-06-10 10:26:00.633262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.633477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.633510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.633539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.633568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.633607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.633831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.633839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.635252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.635283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.635314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.635343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.635671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.635705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.635734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.635779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.635809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.636214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.636223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.637897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.637928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.637977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.638006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.638221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.919 [2024-06-10 10:26:00.638254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.638284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.638312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.919 [2024-06-10 10:26:00.638342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.638557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.638574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.639931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.639962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.639991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.640020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.640235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.640270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.640299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.640328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.640358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.640664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.640675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.642778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.642809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.642840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.642869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.643100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.643133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.643162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.920 [2024-06-10 10:26:00.643191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.643220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.643438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.643447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.644812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.644846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.644875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.644904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.645120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.645167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.645196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.645225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.645254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.645471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.645480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.647383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.647415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.647447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.647476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.647693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.647725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.647754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.647783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.647818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.920 [2024-06-10 10:26:00.648038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.648046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.652086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.652120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.652148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.652177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.652526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.652562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.652591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.652620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.652648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.652975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.652986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.656076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.656109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.656138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.656166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.656466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.656500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.656530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.656560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.656588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.656805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.656813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.920 [2024-06-10 10:26:00.659957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.659991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.660020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.660048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.660363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.660400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.660430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.660459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.660489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.660706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.660714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.663913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.663946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.663977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.664005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.664221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.664253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.664282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.664311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.664340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.664596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.664605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.666798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.920 [2024-06-10 10:26:00.666833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.920 [2024-06-10 10:26:00.666862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.666891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.667110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.667145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.667174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.667204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.667233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.667449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.667458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.671440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.671475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.671504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.671536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.671845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.671882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.671912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.671941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.671970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.672340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.672348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.675786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.675819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.675852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.675880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.921 [2024-06-10 10:26:00.676101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.676133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.676163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.676192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.676221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.676437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.676445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.679963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.679996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.680028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.680057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.680283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.680316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.680345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.680374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.680403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.680619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.680628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.685232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.685285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.685315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.685344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.685672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.685709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.921 [2024-06-10 10:26:00.685740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.685769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.685798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.686113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.686122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.689198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.689232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.689261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.689289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.689509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.689541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.689570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.689599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.689628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.689847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.689856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.693548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.693582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.693612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.693641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.693886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.693919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.693948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.693977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.921 [2024-06-10 10:26:00.694005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.694274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.694282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.698455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.698493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.698525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.921 [2024-06-10 10:26:00.698553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.698901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.698935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.698965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.698995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.699023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.699445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.699453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.702982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.703016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.703045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.703074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.703290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.703325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.703354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.703383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.703412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.703653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.922 [2024-06-10 10:26:00.703670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.707774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.707813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.707844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.707873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.708230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.708263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.708292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.708324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.708353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.708643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.708651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.712236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.712269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.713733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.713764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.713985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.714019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.714048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.714077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.714106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.714410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.714418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.716184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.922 [2024-06-10 10:26:00.716215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.716244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.717662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.717886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.717919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.717948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.717978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.718008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.718228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.718236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.720606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.720909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.721209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.721506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.721816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.723150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.724527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.726031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.727307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.727563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.727572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.729200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.729500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:38.922 [2024-06-10 10:26:00.729802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:38.922 [2024-06-10 10:26:00.730102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:39.193 [2024-06-10 10:26:01.008429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:39.193 [... the same src_mbufs allocation error is emitted continuously between 2024-06-10 10:26:00.730102 and 10:26:01.008429; intermediate duplicate occurrences omitted ...]
00:30:39.193 [2024-06-10 10:26:01.008466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.008497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.008526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.008791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.008826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.008855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.008887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.008916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.009135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.009142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.010470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.010502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.010530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.010559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.010776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.010810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.010842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.010871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.010900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.011162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.011171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.012983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.013014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.013043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.193 [2024-06-10 10:26:01.013078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.013295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.013327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.013356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.013385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.013419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.013639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.013648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.015089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.015120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.015148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.015176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.015392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.015428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.015458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.015487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.015516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.015732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.015747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.017513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.017544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.017576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.017605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.017900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.193 [2024-06-10 10:26:01.017933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.017962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.017990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.018019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.018275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.018283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.019558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.019589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.019617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.019646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.019933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.019967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.019996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.020025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.020054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.020270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.020277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.021840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.021871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.021902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.021931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.022242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.022274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.022304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.193 [2024-06-10 10:26:01.022333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.022362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.022619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.022627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.023879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.023910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.023939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.193 [2024-06-10 10:26:01.023968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.024183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.024215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.024244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.024274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.024303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.024519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.024536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.026223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.026256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.026287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.026317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.026605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.026637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.026667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.026695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.026724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.194 [2024-06-10 10:26:01.027025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.027034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.028333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.028364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.028392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.028421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.028720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.028761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.028791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.028823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.028852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.029087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.029095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.030438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.030469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.030498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.030526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.030841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.030873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.030903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.030932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.030962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.031271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.031279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.194 [2024-06-10 10:26:01.032796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.032829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.032858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.032887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.033102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.033134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.033164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.033192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.033225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.033547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.033555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.034797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.034830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.035253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.035283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.035312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.194 [2024-06-10 10:26:01.035610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.460 [2024-06-10 10:26:01.056321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.460 [2024-06-10 10:26:01.056369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.460 [2024-06-10 10:26:01.057813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.460 [2024-06-10 10:26:01.057851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.460 [2024-06-10 10:26:01.058128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.460 [2024-06-10 10:26:01.058164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.058440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.461 [2024-06-10 10:26:01.058782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.058790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.058797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.067865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.068169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.068465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.068782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.068791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.071120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.072577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.074050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.075632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.076149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.076448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.076744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.077045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.077302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.077310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.079647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.080947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.082501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.083916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.084515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.084812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.461 [2024-06-10 10:26:01.085112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.085929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.086184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.086194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.088760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.090334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.091851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.092149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.092756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.093058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.093608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.094886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.095106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.095114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.098053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.099622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.100108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.100406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.101014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.101315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.102786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.104336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.104558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.104567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.461 [2024-06-10 10:26:01.107482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.108347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.108666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.108966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.109661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.111025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.112298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.113771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.113993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.114001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.116600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.116908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.117207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.117505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.118774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.120058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.121634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.123147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.123414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.123423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.124944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.125244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.125543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.125846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.461 [2024-06-10 10:26:01.127406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.128980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.130538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.131192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.131433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.131443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.133089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.133390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.133687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.134618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.136467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.138037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.138677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.139957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.140177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.140185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.142074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.461 [2024-06-10 10:26:01.142374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.143094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.144383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.146193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.147082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.148516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.150094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.462 [2024-06-10 10:26:01.150314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.150322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.152248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.152552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.154130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.155713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.156786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.158357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.159931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.160232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.160629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.160638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.163859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.165444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.166140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.167451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.169163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.170470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.170767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.171066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.171356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.171365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.174473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.175131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.462 [2024-06-10 10:26:01.176548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.178074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.179704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.180006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.180303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.180599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.180925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.180934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.182831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.183131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.183430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.183726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.184300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.184600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.184900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.185196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.185458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.185470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.187468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.187771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.188089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.188388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.188976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.189275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.462 [2024-06-10 10:26:01.189572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.189875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.190245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.190253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.192085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.192387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.192685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.192985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.193573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.193875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.194173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.194469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.194759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.194767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.196612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.196915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.197213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.197509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.198105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.198404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.198701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.199000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.199308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.462 [2024-06-10 10:26:01.199317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.462 [2024-06-10 10:26:01.201505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.462-00:30:39.795 [2024-06-10 10:26:01.201 - 10:26:01.412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (the same error line repeats continuously over this interval; several hundred near-identical occurrences elided) 
00:30:39.795 [2024-06-10 10:26:01.412883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.795 [2024-06-10 10:26:01.412892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.418111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.419692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.420898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.421198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.421498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.421798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.422099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.423144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.424427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.424646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.424655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.427462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.428963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.429913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.430634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.430949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.431722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.432632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.432935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.434055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.434281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.434289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.439918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.795 [2024-06-10 10:26:01.440222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.440520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.440818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.441137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.442050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.443327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.444906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.446485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.446852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.446861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.448806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.449692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.449995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.451017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.451252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.451553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.452465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.453760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.455306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.455528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.455536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.459349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.459655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.459957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.795 [2024-06-10 10:26:01.460892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.461196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.462746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.464319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.465090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.466518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.466738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.466746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.468375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.468834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.470034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.470334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.470637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.471924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.473358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.474425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.475889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.476111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.476119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.481603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.483173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.484733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.485443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.485662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.795 [2024-06-10 10:26:01.486958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.488461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.489872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.490932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.491280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.491288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.494145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.495452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.497030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.498472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.498714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.795 [2024-06-10 10:26:01.500002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.501572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.503128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.503613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.504030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.504040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.508787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.509619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.510893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.512486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.512709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.513595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.514784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.796 [2024-06-10 10:26:01.515269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.515566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.515789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.515798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.517924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.518225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.518537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.518839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.519138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.519438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.520672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.521085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.521381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.521600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.521608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.525013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.525315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.525613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.525915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.526181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.527176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.527483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.528402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.529190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.796 [2024-06-10 10:26:01.529495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.529503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.531366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.531669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.531970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.532269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.532515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.533727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.534031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.534713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.535680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.536003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.536011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.538533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.538839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.539138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.540387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.540726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.541035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.542165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.542743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.543043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.543337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.543345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.796 [2024-06-10 10:26:01.545579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.545884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.546183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.547229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.547602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.547907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.548813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.549609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.549910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.550266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.550274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.553096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.553700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.554760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.555062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.555342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.556586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.556887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.557184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.796 [2024-06-10 10:26:01.557481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.557897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.557907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.560487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.560867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.797 [2024-06-10 10:26:01.562137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.562436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.562732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.564221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.564521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.564817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.565121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.565490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.565499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.569259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.569562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.569950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.571191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.571558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.571862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.572161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.572458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.572755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.573069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.573078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.576471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.576774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.577131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.578392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.797 [2024-06-10 10:26:01.578721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.579026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.579324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.579621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.579921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.580232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.580241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.582767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.584189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.584488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.584787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.585083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.585384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.585691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.585991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.586287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.586596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.586605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.588373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.589505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.590112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.590410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.590674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.590977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.797 [2024-06-10 10:26:01.591275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.591572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.591874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.592220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.592228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.596453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.596755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.597057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.597356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.597661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.597965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.598264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.598560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.598863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.599170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.599178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.602380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.602680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.602981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.603309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.603623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.603926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.604228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.604524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.797 [2024-06-10 10:26:01.604824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.605078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.605087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.607398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.608633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.608978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.610426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.610803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.611107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.611405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.611701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.612001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.612500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.797 [2024-06-10 10:26:01.612509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.616279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.616583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.616884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.617188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.617522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.619049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.619352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.619680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.621010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.621350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.798 [2024-06-10 10:26:01.621361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.626323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.626625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.627341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.627371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.627617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.629182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.630737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.631594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.633100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.633320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.633328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.635058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.635418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.636718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.637018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.637308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.638596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.640153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.641701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.642530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.642752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.642761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.646387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:39.798 [2024-06-10 10:26:01.646422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.647594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.647629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.648006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.648607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.649888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.651438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.653002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.653264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:39.798 [2024-06-10 10:26:01.653272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.655104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.655138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.656399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.656698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.656999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.658441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.658472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.658768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.658799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.659087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.659096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.662709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.664277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.061 [2024-06-10 10:26:01.665744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.061 [2024-06-10 10:26:01.665776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:40.061 [... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev_task_alloc_resources (accel_dpdk_cryptodev.c:468) repeat continuously from 10:26:01.665 through 10:26:01.943; duplicate log entries omitted ...]
00:30:40.330 [2024-06-10 10:26:01.943288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:40.330 [2024-06-10 10:26:01.943596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.943898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.944197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.944454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.945466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.945763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.946610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.947519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.947842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.947850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.952123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.952428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.330 [2024-06-10 10:26:01.952726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.953031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.953335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.954592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.954892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.955493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.956602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.956984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.956993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.961269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.961570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.961870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.331 [2024-06-10 10:26:01.962173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.962498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.963802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.964103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.964671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.965853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.966221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.966230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.971234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.971538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.971839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.972138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.972555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.973964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.974261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.974707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.975920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.976363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.976371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.980979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.981282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.981581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.981882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.982428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.331 [2024-06-10 10:26:01.983829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.984126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.984560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.985721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.986053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.986062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.990840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.991148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.991446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.991744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.992085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.993590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.993891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.994187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.995463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.995838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:01.995846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.000868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.001169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.001467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.001784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.002110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.003582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.003885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.331 [2024-06-10 10:26:02.004191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.005484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.005830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.005839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.011029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.011331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.011629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.011949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.012254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.013732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.014033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.014330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.015714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.016034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.016045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.020945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.021246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.021544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.021853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.022188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.023712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.024013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.024310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.025714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.331 [2024-06-10 10:26:02.026102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.026111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.031409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.031714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.032015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.032326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.032755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.034261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.331 [2024-06-10 10:26:02.034559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.034858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.036226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.036606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.036615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.042296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.042599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.042898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.043199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.043586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.045090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.045389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.045696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.047149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.047582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.047592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.332 [2024-06-10 10:26:02.052458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.052760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.053066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.054533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.054980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.056436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.057319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.058128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.058426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.058696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.058705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.061900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.062383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.063596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.063898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.064186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.064489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.064787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.066172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.066469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.066790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.066798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.070731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.071037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.332 [2024-06-10 10:26:02.072533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.072835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.073161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.073466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.073765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.075226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.075527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.075830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.075840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.081784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.083174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.084606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.084639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.084864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.086309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.087186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.087982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.088280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.088533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.088542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.093109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.094572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.096144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.097710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.332 [2024-06-10 10:26:02.097942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.098929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.099602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.099902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.101104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.101377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.101385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.105865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.105900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.107472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.107503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.107723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.108710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.110200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.110500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.110798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.111020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.111028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.114752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.114787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.116068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.117625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.117851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.118696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.332 [2024-06-10 10:26:02.118742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.120173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.120203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.120586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.332 [2024-06-10 10:26:02.120595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.122836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.124404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.125560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.125590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.125811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.127100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.127130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.127159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.128651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.128876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.128884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.131905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.131939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.131968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.131997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.132231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.132265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.132294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.132323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.333 [2024-06-10 10:26:02.132352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.132582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.132591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.136747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.136781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.136810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.136841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.137087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.137121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.137150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.137179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.137216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.137433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.137442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.139811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.139847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.139877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.139906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.140122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.140155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.140184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.140215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.140244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.140467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.333 [2024-06-10 10:26:02.140475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.144848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.144882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.144911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.144940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.145159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.145192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.145221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.145250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.145279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.145577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.145586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.148450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.148483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.148512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.148541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.148762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.148794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.148827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.148857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.148886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.149143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.149152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.153248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.333 [2024-06-10 10:26:02.153288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.153319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.153348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.153664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.153700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.153734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.153782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.153811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.154031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.154039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.157142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.157175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.157207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.157236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.157485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.157517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.157547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.157576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.157604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.157846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.157856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.162349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.162382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.333 [2024-06-10 10:26:02.162411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.334 [2024-06-10 10:26:02.162439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.162763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.162799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.162831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.162860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.162892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.163198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.163207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.167576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.167609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.167638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.167667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.167890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.167923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.167953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.167981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.168010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.168227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.168235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.170792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.170828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.170857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.170886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.334 [2024-06-10 10:26:02.171191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.334 [2024-06-10 10:26:02.171226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! — identical error repeated for every allocation attempt from 10:26:02.171226 through 10:26:02.466130 (console time 00:30:40.334 to 00:30:40.867) ...]
00:30:40.867 [2024-06-10 10:26:02.466130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:40.867 [2024-06-10 10:26:02.467186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.467575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.467879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.468180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.468475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.469692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.469949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.469958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.472704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.474273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.475467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.475766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.476079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.476379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.476676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.477679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.478944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.479164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.479172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.482013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.483477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.483776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.484076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.484391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.867 [2024-06-10 10:26:02.484692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.867 [2024-06-10 10:26:02.485464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.486732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.488265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.488486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.488494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.491388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.491691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.491991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.492289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.492756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.493103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.494381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.495947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.497516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.497761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.497770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.499629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.499930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.500229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.500259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.500685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.500988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.502422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.868 [2024-06-10 10:26:02.503997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.505568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.505815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.505826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.507829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.508129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.508425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.508723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.509031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.510460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.511941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.513511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.514593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.514836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.514845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.516471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.516509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.516807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.516840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.517220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.517520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.518874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.520419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.521990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.868 [2024-06-10 10:26:02.522247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.522256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.524254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.524292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.524590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.524890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.525320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.525619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.525649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.527074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.527104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.527322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.527330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.528687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.530258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.531829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.531860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.532448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.532765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.532795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.532827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.533124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.533503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.533514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.868 [2024-06-10 10:26:02.534947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.534988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.535017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.535046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.535263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.535296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.535325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.535354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.535383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.535611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.535620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.536923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.536972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.537002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.537030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.537399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.537432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.537462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.537491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.868 [2024-06-10 10:26:02.537519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.537814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.537826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.539789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.539820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.869 [2024-06-10 10:26:02.539852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.539881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.540096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.540129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.540158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.540186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.540219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.540481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.540490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.541738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.541768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.541797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.541831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.542163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.542196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.542226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.542255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.542284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.542590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.542599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.544047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.544078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.544106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.544135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.869 [2024-06-10 10:26:02.544350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.544382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.544411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.544440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.544469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.544684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.544692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.546022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.546057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.546086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.546115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.546330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.546365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.546395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.546424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.546453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.546863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.546871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.548704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.548734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.548766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.548795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.549082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.549115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.869 [2024-06-10 10:26:02.549144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.549173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.549201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.549421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.549429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.550732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.550763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.550791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.550820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.551038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.551073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.551103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.551132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.551160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.551427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.551436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.554015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.554047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.554076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.554108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.554324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.554357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.554386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.554416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.869 [2024-06-10 10:26:02.554445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.554661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.554669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.556020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.556051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.556080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.556108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.556323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.556356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.556385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.556413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.556442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.556658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.556667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.558397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.558428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.558460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.869 [2024-06-10 10:26:02.558489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.558777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.558810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.558842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.558871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.558900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.559132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.870 [2024-06-10 10:26:02.559140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.560423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.560454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.560483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.560512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.560745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.560779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.560809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.560840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.560870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.561090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.561098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.562733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.562764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.562793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.562824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.563132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.563164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.563194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.563223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.563252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.563497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.563505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.564804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.870 [2024-06-10 10:26:02.564838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.564868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.564896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.565116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.565148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.565177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.565207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.565235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.565455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.565464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.567098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.567135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.567167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.567196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.567544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.567576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.567605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.567634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.567662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.567985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.567993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.569236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.569267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.569296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.870 [2024-06-10 10:26:02.569325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.569656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.569691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.569720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.569749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.569778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.570060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.570069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.571435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.571467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.571496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.571525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.571850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.571883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.571937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.571966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.571995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.572400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.572408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.573771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.573802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.573833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.573862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.574079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.870 [2024-06-10 10:26:02.574111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.574140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.574169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.574198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.574415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.574423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.575781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.575834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.575866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.575894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.576179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.576212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.576243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.870 [2024-06-10 10:26:02.576272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.871 [2024-06-10 10:26:02.576301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.871 [2024-06-10 10:26:02.576585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.871 [2024-06-10 10:26:02.576592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.871 [2024-06-10 10:26:02.578007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.871 [2024-06-10 10:26:02.578038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.871 [2024-06-10 10:26:02.578067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.871 [2024-06-10 10:26:02.578095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.871 [2024-06-10 10:26:02.578316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.871 [2024-06-10 10:26:02.578352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:40.871 [2024-06-10 10:26:02.578381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:40.871 [2024-06-10 10:26:02.578410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:41.145 [... accel_dpdk_cryptodev_task_alloc_resources reports the same "Failed to get src_mbufs!" error for every queued crypto task from 10:26:02.578410 through 10:26:02.844045; duplicate log records omitted ...]
00:30:41.145 [2024-06-10 10:26:02.844077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.844107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.844136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.844165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.844473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.844481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.845711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.845743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.845772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.845801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.846076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.846109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.846138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.846170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.846199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.846441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.846449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.847925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.847956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.847985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.848014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.848324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.848376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.848405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.145 [2024-06-10 10:26:02.848434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.848463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.145 [2024-06-10 10:26:02.848820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.848832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.850141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.850172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.850201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.850230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.850455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.850487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.850516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.850546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.850574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.850791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.850799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.852116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.852148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.852177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.852206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.852660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.852693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.852722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.852751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.852780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.146 [2024-06-10 10:26:02.853077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.853087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.854595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.854626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.854658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.854686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.854909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.854941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.854970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.854999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.855027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.855326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.855334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.856553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.856583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.856612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.856641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.857055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.857090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.857118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.857147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.857176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.857483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.857491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.146 [2024-06-10 10:26:02.859121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.859152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.859183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.859212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.859431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.859463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.859492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.859521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.859549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.859766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.859774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.861093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.861125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.861154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.861183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.861399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.861431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.861461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.861490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.861519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.861850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.861858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.864000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.864031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.146 [2024-06-10 10:26:02.864062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.864091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.864339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.864371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.864401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.864430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.864458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.864674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.864686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.865998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.866029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.866058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.866086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.866303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.866337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.866368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.866396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.866425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.866641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.866650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.868541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.868572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.868601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.868630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.146 [2024-06-10 10:26:02.868850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.868882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.868911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.868940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.868969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.869187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.869195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.870543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.870581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.870610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.870638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.870857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.870889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.870919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.870948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.870980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.871197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.871205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.872950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.872981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.873013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.873042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.873340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.873371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.146 [2024-06-10 10:26:02.873401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.873430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.873467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.873689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.873697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.874978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.875009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.875037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.875066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.875333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.875367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.875396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.875425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.875454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.875670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.875679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.877150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.877181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.877210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.877254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.877568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.877606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.877636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.877666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.146 [2024-06-10 10:26:02.877695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.877996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.878004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.879230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.879261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.879290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.879319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.879610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.879642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.879671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.879700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.879729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.880029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.880038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.881420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.881451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.881483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.881511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.881826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.881858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.146 [2024-06-10 10:26:02.881905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.881935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.881965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.882352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.147 [2024-06-10 10:26:02.882360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.883660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.883692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.883722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.883753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.883972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.884007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.884037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.884066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.884094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.884391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.884399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.885665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.885696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.885746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.885774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.886178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.886210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.886241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.886270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.886299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.886597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.886605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.888044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.147 [2024-06-10 10:26:02.888076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.888104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.888132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.888352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.888384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.888413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.888442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.888471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.888791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.888799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.890038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.890072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.890104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.890132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.890394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.890426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.890456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.890485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.890514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.890859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.890867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.892583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.892614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.892642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.147 [2024-06-10 10:26:02.892671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.892889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.892925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.892954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.892983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.893011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.893303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.893312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.894627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.894658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.894700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.894728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.895020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.895053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.895082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.895111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.895140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.895412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.895420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.897158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.897189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.897218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.897246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.897462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.147 [2024-06-10 10:26:02.897494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.897523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.897552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.897581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.897816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.897827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.899143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.899175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.899206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.899235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.899488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.899520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.899550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.899579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.899636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.899983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.899991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.901776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.901807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.901839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.903302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.903523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.903557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.903602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.147 [2024-06-10 10:26:02.903635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.903664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.903886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.903894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.905226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.905257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.905286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.905314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.905562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.905594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.905641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.905671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.905700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.906136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.906145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.907808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.909380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.909411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.910978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.911239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.911271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.911301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.911330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.147 [2024-06-10 10:26:02.911386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.147 [2024-06-10 10:26:02.911603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:41.416 [2024-06-10 10:26:03.182989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:41.416 (the same error was logged several hundred times between 10:26:02.911 and 10:26:03.183; intermediate repetitions omitted)
00:30:41.416 [2024-06-10 10:26:03.183018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.183046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.183265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.183274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.185064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.185096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.185125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.185154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.185447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.185479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.185508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.185537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.185565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.185809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.185817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.187113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.187144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.187173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.187201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.187496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.187528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.187558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.187589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.187618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.416 [2024-06-10 10:26:03.187838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.187846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.189657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.189689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.189720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.189749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.190079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.190113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.190144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.190174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.190202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.190469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.190478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.191784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.191815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.191864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.191893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.192111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.192145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.192174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.192207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.192236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.192455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.192463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.416 [2024-06-10 10:26:03.194052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.194084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.194112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.194141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.194416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.194452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.194481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.194510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.194539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.194848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.194861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.196141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.196173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.196201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.196229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.196564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.196597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.196626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.196655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.196684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.196934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.196943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.198292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.198324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.416 [2024-06-10 10:26:03.198355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.198384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.198687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.198726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.198756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.198785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.198837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.416 [2024-06-10 10:26:03.199314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.199324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.200785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.200816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.200852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.200881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.201253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.201288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.201317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.201346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.201375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.201590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.201598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.203358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.203391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.203420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.203448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.417 [2024-06-10 10:26:03.203911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.203943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.203973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.204002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.204031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.204338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.204347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.205617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.205651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.205681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.205710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.206007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.206040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.206069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.206099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.206127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.206447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.206455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.207842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.207874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.207906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.207935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.208255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.208287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.417 [2024-06-10 10:26:03.208327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.208357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.208386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.208738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.208746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.210140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.210171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.210200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.210229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.210448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.210483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.210512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.210541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.210570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.210786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.210795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.212204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.212237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.212266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.212294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.212672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.212703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.212733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.212762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.417 [2024-06-10 10:26:03.212795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.213079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.213088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.214630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.214661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.214690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.214718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.214938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.214970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.417 [2024-06-10 10:26:03.215000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.215029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.215057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.215380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.215388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.216635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.216667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.216698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.216727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.217039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.217072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.217102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.217130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.217160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.217468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.418 [2024-06-10 10:26:03.217477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.219065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.219096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.219125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.220689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.220911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.220947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.220979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.221008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.221038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.221424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.221433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.222670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.222701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.222730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.222760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.223080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.223113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.223143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.223173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.223203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.223500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.223508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.225215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.418 [2024-06-10 10:26:03.226832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.226864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.228300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.228693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.228725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.228754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.228782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.228811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.229063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.229072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.230467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.230766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.230797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.230833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.231188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.231221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.231519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.231549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.231849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.232118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.232126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.234871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.234904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.234933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.418 [2024-06-10 10:26:03.235230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.235561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.235594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.235895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.236192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.236222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.236529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.236537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.238657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.238963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.239261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.239558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.239892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.240195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.240492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.240788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.241088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.241399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.241408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.243571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.243878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.244177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.244473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.244782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.418 [2024-06-10 10:26:03.245086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.245384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.245679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.245979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.246281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.246289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.248722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.418 [2024-06-10 10:26:03.249026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.249323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.249619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.249986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.250287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.250586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.250886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.251182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.251478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.251487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.253784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.254087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.254404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.254701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.255003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.255304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.255602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.419 [2024-06-10 10:26:03.255901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.256198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.256513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.256523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.258382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.258684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.258991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.259290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.259631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.259933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.260232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.260530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.260829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.261123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.261132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.263216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.263523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.263831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.265347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.265743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.266047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.267504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.267802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.268101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.419 [2024-06-10 10:26:03.268491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.268500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.271483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.271788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.272089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.272387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.272701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.274035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.419 [2024-06-10 10:26:03.274335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.682 [2024-06-10 10:26:03.274634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.682 [2024-06-10 10:26:03.276136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.682 [2024-06-10 10:26:03.276473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.276482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.278398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.278699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.280199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.280498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.280807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.281112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.281412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.282774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.283081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.283393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.683 [2024-06-10 10:26:03.283402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.683 [2024-06-10 10:26:03.285380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:41.683 [... identical "accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!" lines repeated for every allocation attempt from 10:26:03.285380 through 10:26:03.541841; duplicate log lines omitted ...]
00:30:41.688 [2024-06-10 10:26:03.541841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:41.688 [2024-06-10 10:26:03.542058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.688 [2024-06-10 10:26:03.542070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.688 [2024-06-10 10:26:03.544649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.688 [2024-06-10 10:26:03.544683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.688 [2024-06-10 10:26:03.544714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.688 [2024-06-10 10:26:03.544742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.688 [2024-06-10 10:26:03.545073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.688 [2024-06-10 10:26:03.545107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.688 [2024-06-10 10:26:03.545137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.688 [2024-06-10 10:26:03.545166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.688 [2024-06-10 10:26:03.545200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.950 [2024-06-10 10:26:03.545641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.950 [2024-06-10 10:26:03.545652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.547471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.547505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.547534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.547563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.547781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.547816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.547849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.547879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.547908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.548125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.548133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.951 [2024-06-10 10:26:03.551143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.551177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.551206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.551252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.551621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.551654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.551684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.551716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.551748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.552064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.552074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.553803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.553840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.553870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.553899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.554114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.554151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.554181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.554210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.554238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.554456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.554465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.557391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.557424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.951 [2024-06-10 10:26:03.557456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.557485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.557816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.557851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.557884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.557912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.557941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.558227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.558235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.561177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.561210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.561239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.562026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.562328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.562366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.562396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.562425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.562454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.562799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.562809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.565393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.565427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.565456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.565490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.951 [2024-06-10 10:26:03.565869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.565902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.565931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.565960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.565989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.566309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.566317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.568164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.568467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.568512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.568809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.569168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.569201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.569230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.569259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.569288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.569607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.569615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.571354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.571656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.571690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.571718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.572020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.572062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.951 [2024-06-10 10:26:03.572359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.572388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.572684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.573014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.573023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.575534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.951 [2024-06-10 10:26:03.575567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.575598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.575898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.576460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.576492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.576790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.577089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.577121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.577410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.577418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.579262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.579562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.579872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.580170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.580608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.580912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.581210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.581505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.952 [2024-06-10 10:26:03.581802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.582120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.582130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.584175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.584478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.584775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.585074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.585510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.585810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.586111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.586407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.586702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.586984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.586993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.589060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.589361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.589658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.589959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.590286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.590587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.590887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.591182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.591479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.591761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.952 [2024-06-10 10:26:03.591770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.593682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.593986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.594284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.594579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.594894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.595196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.595513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.595816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.596117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.596436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.596444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.598401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.598702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.599018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.599314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.599653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.599956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.600256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.600553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.600852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.601164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.601173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.602997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:41.952 [2024-06-10 10:26:03.603297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.603594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.603894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.604187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.604489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.604788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.605088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.605393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.605709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.605718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.607764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.608072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.608370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.608668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.608964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.609263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.609564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.609864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:41.952 [2024-06-10 10:26:03.612332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.952 [2024-06-10 10:26:03.612399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.952 [2024-06-10 10:26:03.614233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.952 [2024-06-10 10:26:03.614272] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.952 [2024-06-10 10:26:03.677020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.952 [2024-06-10 10:26:03.678149] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:30:41.952 [2024-06-10 10:26:03.678193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.952 [2024-06-10 10:26:03.679530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.952 [2024-06-10 10:26:03.681684] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.683106] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.684233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.685558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.685817] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.686811] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.686860] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.688197] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.688240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.689334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.689377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.690679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.690908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.690922] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.690934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.690947] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.693245] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.694400] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.695818] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.697224] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.698182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.699599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:30:41.953 [2024-06-10 10:26:03.700758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.702095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.702321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.702335] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.702348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.702360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.704517] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.705691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.707113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.708522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.709483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.710949] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.712140] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.713493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.713718] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.713732] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.713745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.713757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.716069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.717224] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.718641] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.720090] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.720943] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.722384] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:30:41.953 [2024-06-10 10:26:03.723694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.724993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.725218] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.725232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.725245] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.725262] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.727858] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.729006] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.730360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.731782] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.732683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.733965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.735350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.736626] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.736856] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.736870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.736883] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.736896] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.739443] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.740498] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.741864] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.743236] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.744236] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.745623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:30:41.953 [2024-06-10 10:26:03.746699] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.748013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.748236] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.748250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.748263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.748276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.750549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.751807] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.753201] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.754455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.755631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.756935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.758013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.759284] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.759508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.759521] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.759534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.759546] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.761453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.762815] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.764006] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.765296] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.953 [2024-06-10 10:26:03.766613] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.767696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:30:41.954 [2024-06-10 10:26:03.768813] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.770130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.770353] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.770375] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.770388] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.770400] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.772329] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.773684] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.774765] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.776071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.777475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.778439] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.779517] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.780794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.781026] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.781043] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.781056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.781069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.782917] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.784281] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.785344] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.786618] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.788156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:30:41.954 [2024-06-10 10:26:03.788996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:30:41.954 [2024-06-10 10:26:03.790098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:30:42.216 [2024-06-10 10:26:03.876790] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:30:42.477
00:30:42.477 Latency(us)
00:30:42.477 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:30:42.477 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:30:42.477 Verification LBA range: start 0x0 length 0x100
00:30:42.477 crypto_ram : 5.70 47.54 2.97 0.00 0.00 2617529.53 3024.74 2064888.12
00:30:42.477 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:30:42.477 Verification LBA range: start 0x100 length 0x100
00:30:42.477 crypto_ram : 5.72 44.79 2.80 0.00 0.00 2753003.52 535580.36 2193943.63
00:30:42.477 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:30:42.477 Verification LBA range: start 0x0 length 0x100
00:30:42.477 crypto_ram1 : 5.70 47.71 2.98 0.00 0.00 2521536.57 2596.23 1871304.86
00:30:42.477 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:30:42.477 Verification LBA range: start 0x100 length 0x100
00:30:42.477 crypto_ram1 : 5.73 47.32 2.96 0.00 0.00 2549195.86 819.20 2000360.37
00:30:42.477 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:30:42.477 Verification LBA range: start 0x0 length 0x100
00:30:42.477 crypto_ram2 : 5.56 340.57 21.29 0.00 0.00 339274.56 65737.65 529127.58
00:30:42.477 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:30:42.477 Verification LBA range: start 0x100 length 0x100
00:30:42.477 crypto_ram2 : 5.55 322.84 20.18 0.00 0.00 356320.61 66544.25 529127.58
00:30:42.477 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:30:42.477 Verification LBA range: start 0x0 length 0x100
00:30:42.477 crypto_ram3 : 5.64 351.25 21.95 0.00 0.00 320559.60 22483.89 425883.18
00:30:42.477 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:30:42.477 Verification LBA range: start 0x100 length 0x100
00:30:42.477 crypto_ram3 : 5.67 338.73 21.17 0.00 0.00 331068.97 37103.46 337157.51
00:30:42.477 ===================================================================================================================
00:30:42.477 Total : 1540.74 96.30 0.00 0.00 617140.57 819.20 2193943.63
00:30:42.737
00:30:42.737 real 0m8.559s
00:30:42.737 user 0m16.530s
00:30:42.737 sys 0m0.274s
00:30:42.737 10:26:04 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable
00:30:42.737 10:26:04 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:30:42.737 ************************************
00:30:42.737 END TEST bdev_verify_big_io
00:30:42.737 ************************************
00:30:42.737 10:26:04 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:30:42.737 10:26:04 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
00:30:42.737 10:26:04 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable
00:30:42.737 10:26:04 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:30:42.737 ************************************
00:30:42.737 START TEST bdev_write_zeroes
00:30:42.737 ************************************
00:30:42.737 10:26:04 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:42.737 [2024-06-10 10:26:04.574463] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:30:42.737 [2024-06-10 10:26:04.574509] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1184861 ] 00:30:42.997 [2024-06-10 10:26:04.663511] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:42.998 [2024-06-10 10:26:04.737958] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:42.998 [2024-06-10 10:26:04.759039] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:42.998 [2024-06-10 10:26:04.767069] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:42.998 [2024-06-10 10:26:04.775086] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:42.998 [2024-06-10 10:26:04.859177] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:45.543 [2024-06-10 10:26:07.014780] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:45.543 [2024-06-10 10:26:07.014838] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:45.543 [2024-06-10 10:26:07.014847] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:45.543 [2024-06-10 10:26:07.022798] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:45.543 [2024-06-10 10:26:07.022809] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:45.543 [2024-06-10 10:26:07.022815] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:45.543 [2024-06-10 10:26:07.030818] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:45.543 [2024-06-10 10:26:07.030835] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:45.543 [2024-06-10 10:26:07.030840] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:45.543 [2024-06-10 10:26:07.038889] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:45.543 [2024-06-10 10:26:07.038911] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:45.543 [2024-06-10 10:26:07.038917] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:45.543 Running I/O for 1 seconds... 
00:30:46.485
00:30:46.485 Latency(us)
00:30:46.485 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:30:46.485 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:30:46.485 crypto_ram : 1.02 2382.39 9.31 0.00 0.00 53391.99 4814.38 64931.05
00:30:46.485 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:30:46.485 crypto_ram1 : 1.02 2387.95 9.33 0.00 0.00 53000.43 4763.96 60091.47
00:30:46.485 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:30:46.485 crypto_ram2 : 1.02 18444.58 72.05 0.00 0.00 6853.39 2129.92 9124.63
00:30:46.485 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:30:46.485 crypto_ram3 : 1.02 18422.62 71.96 0.00 0.00 6834.91 2117.32 7108.14
00:30:46.485 ===================================================================================================================
00:30:46.485 Total : 41637.55 162.65 0.00 0.00 12173.24 2117.32 64931.05
00:30:46.746
00:30:46.746 real 0m3.862s
00:30:46.746 user 0m3.567s
00:30:46.746 sys 0m0.257s
00:30:46.746 10:26:08 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable
00:30:46.746 10:26:08 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:30:46.746 ************************************
00:30:46.746 END TEST bdev_write_zeroes
00:30:46.746 ************************************
00:30:46.746 10:26:08 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:30:46.746 10:26:08 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
00:30:46.746 10:26:08 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable
00:30:46.746 10:26:08 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:30:46.746 ************************************
00:30:46.746 START TEST bdev_json_nonenclosed
00:30:46.746 ************************************
00:30:46.746 10:26:08 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:30:46.746 [2024-06-10 10:26:08.515610] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization...
00:30:46.746 [2024-06-10 10:26:08.515651] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1185485 ]
00:30:47.007 [2024-06-10 10:26:08.601134] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:30:47.007 [2024-06-10 10:26:08.665169] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:30:47.007 [2024-06-10 10:26:08.665221] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:30:47.007 [2024-06-10 10:26:08.665232] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:30:47.007 [2024-06-10 10:26:08.665238] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:30:47.007 00:30:47.007 real 0m0.269s 00:30:47.007 user 0m0.174s 00:30:47.007 sys 0m0.093s 00:30:47.007 10:26:08 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:47.007 10:26:08 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:30:47.007 ************************************ 00:30:47.007 END TEST bdev_json_nonenclosed 00:30:47.007 ************************************ 00:30:47.007 10:26:08 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:47.007 10:26:08 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:30:47.007 10:26:08 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:47.007 10:26:08 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:47.007 ************************************ 00:30:47.007 START TEST bdev_json_nonarray 00:30:47.007 ************************************ 00:30:47.007 10:26:08 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:47.007 [2024-06-10 10:26:08.856106] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:30:47.007 [2024-06-10 10:26:08.856152] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1185529 ] 00:30:47.268 [2024-06-10 10:26:08.944322] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:47.268 [2024-06-10 10:26:09.020545] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:47.268 [2024-06-10 10:26:09.020606] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:30:47.268 [2024-06-10 10:26:09.020618] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:30:47.268 [2024-06-10 10:26:09.020625] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:30:47.268 00:30:47.268 real 0m0.286s 00:30:47.268 user 0m0.180s 00:30:47.268 sys 0m0.104s 00:30:47.268 10:26:09 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:47.268 10:26:09 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:30:47.268 ************************************ 00:30:47.268 END TEST bdev_json_nonarray 00:30:47.268 ************************************ 00:30:47.268 10:26:09 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:30:47.269 10:26:09 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:30:47.269 10:26:09 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:30:47.269 10:26:09 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:30:47.269 10:26:09 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:30:47.269 10:26:09 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:30:47.269 10:26:09 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:47.269 10:26:09 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:30:47.269 10:26:09 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:30:47.269 10:26:09 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:30:47.269 10:26:09 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:30:47.269 00:30:47.269 real 1m7.205s 00:30:47.269 user 2m42.731s 00:30:47.269 sys 0m6.005s 00:30:47.269 10:26:09 blockdev_crypto_qat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:47.269 10:26:09 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:47.269 ************************************ 00:30:47.269 END TEST blockdev_crypto_qat 00:30:47.269 ************************************ 00:30:47.530 10:26:09 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:30:47.530 10:26:09 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:30:47.530 10:26:09 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:47.530 10:26:09 -- common/autotest_common.sh@10 -- # set +x 00:30:47.530 ************************************ 00:30:47.530 START TEST chaining 00:30:47.530 ************************************ 00:30:47.530 10:26:09 chaining -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:30:47.530 * Looking for test storage... 
00:30:47.530 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:47.530 10:26:09 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@7 -- # uname -s 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:47.530 10:26:09 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:47.530 10:26:09 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:47.530 10:26:09 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:47.530 10:26:09 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:47.530 10:26:09 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:47.530 10:26:09 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:47.530 10:26:09 chaining -- paths/export.sh@5 -- # 
export PATH 00:30:47.530 10:26:09 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@47 -- # : 0 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:47.530 10:26:09 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:30:47.530 10:26:09 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:30:47.530 10:26:09 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:30:47.530 10:26:09 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:30:47.530 10:26:09 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:30:47.530 10:26:09 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:30:47.530 10:26:09 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:30:47.530 10:26:09 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:30:47.530 10:26:09 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:30:47.530 10:26:09 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@296 -- # e810=() 00:30:55.675 
10:26:17 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@297 -- # x722=() 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@298 -- # mlx=() 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.0 (0x8086 - 0x159b)' 00:30:55.675 Found 0000:4b:00.0 (0x8086 - 0x159b) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.1 (0x8086 - 0x159b)' 00:30:55.675 Found 0000:4b:00.1 (0x8086 - 0x159b) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:30:55.675 
10:26:17 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.0: cvl_0_0' 00:30:55.675 Found net devices under 0000:4b:00.0: cvl_0_0 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.1: cvl_0_1' 00:30:55.675 Found net devices under 0000:4b:00.1: cvl_0_1 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:30:55.675 10:26:17 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip 
link set lo up 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:30:55.934 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:30:55.934 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.574 ms 00:30:55.934 00:30:55.934 --- 10.0.0.2 ping statistics --- 00:30:55.934 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:30:55.934 rtt min/avg/max/mdev = 0.574/0.574/0.574/0.000 ms 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:30:55.934 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:30:55.934 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.321 ms 00:30:55.934 00:30:55.934 --- 10.0.0.1 ping statistics --- 00:30:55.934 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:30:55.934 rtt min/avg/max/mdev = 0.321/0.321/0.321/0.000 ms 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@422 -- # return 0 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:30:55.934 10:26:17 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:30:55.934 10:26:17 chaining -- common/autotest_common.sh@723 -- # xtrace_disable 00:30:55.934 10:26:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@481 -- # nvmfpid=1189764 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@482 -- # waitforlisten 1189764 00:30:55.934 10:26:17 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:30:55.934 10:26:17 chaining -- common/autotest_common.sh@830 -- # '[' -z 1189764 ']' 00:30:55.934 10:26:17 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:55.934 10:26:17 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:55.934 10:26:17 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:55.934 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:55.934 10:26:17 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:55.934 10:26:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:55.934 [2024-06-10 10:26:17.795971] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:30:55.934 [2024-06-10 10:26:17.796039] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:56.194 [2024-06-10 10:26:17.876553] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:56.194 [2024-06-10 10:26:17.947385] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:30:56.194 [2024-06-10 10:26:17.947421] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:30:56.194 [2024-06-10 10:26:17.947429] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:30:56.194 [2024-06-10 10:26:17.947435] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:30:56.194 [2024-06-10 10:26:17.947440] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:30:56.194 [2024-06-10 10:26:17.947457] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:30:56.764 10:26:18 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:56.764 10:26:18 chaining -- common/autotest_common.sh@863 -- # return 0 00:30:56.764 10:26:18 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:30:56.764 10:26:18 chaining -- common/autotest_common.sh@729 -- # xtrace_disable 00:30:56.764 10:26:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:57.025 10:26:18 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@69 -- # mktemp 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.91TQisj7sI 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@69 -- # mktemp 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.0IFSGyEME5 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:57.025 malloc0 00:30:57.025 true 00:30:57.025 true 00:30:57.025 [2024-06-10 10:26:18.710753] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:30:57.025 crypto0 00:30:57.025 [2024-06-10 10:26:18.718775] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:30:57.025 crypto1 00:30:57.025 [2024-06-10 10:26:18.726871] tcp.c: 716:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:30:57.025 [2024-06-10 10:26:18.743039] tcp.c:1031:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@85 -- # update_stats 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:57.025 10:26:18 chaining -- 
bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:57.025 10:26:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:57.025 10:26:18 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:57.287 10:26:18 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:57.287 10:26:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:57.287 10:26:18 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.91TQisj7sI bs=1K count=64 
00:30:57.287 64+0 records in 00:30:57.287 64+0 records out 00:30:57.287 65536 bytes (66 kB, 64 KiB) copied, 0.000992798 s, 66.0 MB/s 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.91TQisj7sI --ob Nvme0n1 --bs 65536 --count 1 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@25 -- # local config 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:57.287 10:26:18 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:57.287 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:57.287 10:26:19 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:57.287 "subsystems": [ 00:30:57.287 { 00:30:57.287 "subsystem": "bdev", 00:30:57.287 "config": [ 00:30:57.287 { 00:30:57.287 "method": "bdev_nvme_attach_controller", 00:30:57.287 "params": { 00:30:57.287 "trtype": "tcp", 00:30:57.287 "adrfam": "IPv4", 00:30:57.287 "name": "Nvme0", 00:30:57.287 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:57.287 "traddr": "10.0.0.2", 00:30:57.287 "trsvcid": "4420" 00:30:57.287 } 00:30:57.287 }, 00:30:57.287 { 00:30:57.287 "method": "bdev_set_options", 00:30:57.287 "params": { 00:30:57.287 "bdev_auto_examine": false 00:30:57.287 } 00:30:57.287 } 00:30:57.287 ] 00:30:57.287 } 00:30:57.287 ] 00:30:57.287 }' 00:30:57.287 10:26:19 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.91TQisj7sI --ob Nvme0n1 --bs 65536 --count 1 00:30:57.287 10:26:19 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:57.287 "subsystems": [ 00:30:57.287 { 00:30:57.287 "subsystem": "bdev", 00:30:57.287 "config": [ 00:30:57.287 { 00:30:57.287 "method": "bdev_nvme_attach_controller", 00:30:57.287 "params": { 00:30:57.287 "trtype": "tcp", 00:30:57.287 "adrfam": "IPv4", 00:30:57.287 "name": "Nvme0", 00:30:57.287 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:57.287 "traddr": "10.0.0.2", 00:30:57.287 "trsvcid": "4420" 00:30:57.287 } 00:30:57.287 }, 00:30:57.287 { 00:30:57.287 "method": "bdev_set_options", 00:30:57.287 "params": { 00:30:57.287 "bdev_auto_examine": false 00:30:57.287 } 00:30:57.287 } 00:30:57.287 ] 00:30:57.287 } 00:30:57.287 ] 00:30:57.287 }' 00:30:57.287 [2024-06-10 10:26:19.058213] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:30:57.287 [2024-06-10 10:26:19.058256] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1190042 ] 00:30:57.287 [2024-06-10 10:26:19.143166] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:57.549 [2024-06-10 10:26:19.205493] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:57.809  Copying: 64/64 [kB] (average 15 MBps) 00:30:57.809 00:30:57.809 10:26:19 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:30:57.809 10:26:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:57.809 10:26:19 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:57.809 10:26:19 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:57.809 10:26:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:57.809 10:26:19 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:57.809 10:26:19 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:57.809 10:26:19 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:57.809 10:26:19 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:57.809 10:26:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:30:58.070 10:26:19 chaining 
-- bdev/chaining.sh@95 -- # get_stat executed copy 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@96 -- # update_stats 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:58.070 10:26:19 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:58.070 10:26:19 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:30:58.331 10:26:19 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:58.332 10:26:19 chaining 
-- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:58.332 10:26:19 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:58.332 10:26:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:58.332 10:26:19 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:58.332 10:26:19 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:58.332 10:26:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:58.332 10:26:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:58.332 10:26:19 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:58.332 10:26:20 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:30:58.332 10:26:20 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.0IFSGyEME5 --ib Nvme0n1 --bs 65536 --count 1 00:30:58.332 10:26:20 chaining -- bdev/chaining.sh@25 -- # local config 00:30:58.332 10:26:20 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:58.332 10:26:20 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:58.332 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:58.332 10:26:20 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:58.332 "subsystems": [ 00:30:58.332 { 00:30:58.332 "subsystem": "bdev", 00:30:58.332 "config": [ 00:30:58.332 { 00:30:58.332 "method": "bdev_nvme_attach_controller", 00:30:58.332 "params": { 00:30:58.332 "trtype": "tcp", 00:30:58.332 "adrfam": "IPv4", 00:30:58.332 "name": "Nvme0", 00:30:58.332 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:58.332 "traddr": "10.0.0.2", 00:30:58.332 "trsvcid": "4420" 00:30:58.332 } 00:30:58.332 }, 00:30:58.332 { 00:30:58.332 "method": "bdev_set_options", 00:30:58.332 "params": { 00:30:58.332 "bdev_auto_examine": false 00:30:58.332 } 00:30:58.332 } 00:30:58.332 ] 00:30:58.332 } 00:30:58.332 ] 00:30:58.332 }' 00:30:58.332 10:26:20 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.0IFSGyEME5 --ib Nvme0n1 --bs 65536 --count 1 00:30:58.332 10:26:20 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:58.332 "subsystems": [ 00:30:58.332 { 00:30:58.332 "subsystem": "bdev", 00:30:58.332 "config": [ 00:30:58.332 { 00:30:58.332 "method": "bdev_nvme_attach_controller", 00:30:58.332 "params": { 00:30:58.332 "trtype": "tcp", 00:30:58.332 "adrfam": "IPv4", 00:30:58.332 "name": "Nvme0", 00:30:58.332 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:58.332 "traddr": "10.0.0.2", 00:30:58.332 "trsvcid": "4420" 00:30:58.332 } 00:30:58.332 }, 00:30:58.332 { 00:30:58.332 "method": "bdev_set_options", 00:30:58.332 "params": { 00:30:58.332 
"bdev_auto_examine": false 00:30:58.332 } 00:30:58.332 } 00:30:58.332 ] 00:30:58.332 } 00:30:58.332 ] 00:30:58.332 }' 00:30:58.332 [2024-06-10 10:26:20.112520] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:30:58.332 [2024-06-10 10:26:20.112569] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1190165 ] 00:30:58.592 [2024-06-10 10:26:20.197402] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:58.592 [2024-06-10 10:26:20.259731] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:59.115  Copying: 64/64 [kB] (average 6400 kBps) 00:30:59.115 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | 
select(.opcode == "decrypt").executed' 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:59.115 10:26:20 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.91TQisj7sI /tmp/tmp.0IFSGyEME5 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@25 -- # local config 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:59.115 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:59.115 10:26:20 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:59.115 "subsystems": [ 00:30:59.115 { 00:30:59.116 "subsystem": "bdev", 00:30:59.116 "config": [ 00:30:59.116 { 00:30:59.116 "method": "bdev_nvme_attach_controller", 00:30:59.116 "params": { 00:30:59.116 "trtype": "tcp", 00:30:59.116 "adrfam": "IPv4", 00:30:59.116 "name": "Nvme0", 00:30:59.116 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:59.116 "traddr": "10.0.0.2", 00:30:59.116 "trsvcid": "4420" 00:30:59.116 } 00:30:59.116 }, 00:30:59.116 { 00:30:59.116 "method": "bdev_set_options", 00:30:59.116 "params": { 00:30:59.116 "bdev_auto_examine": false 00:30:59.116 } 00:30:59.116 } 00:30:59.116 ] 00:30:59.116 } 00:30:59.116 ] 00:30:59.116 }' 00:30:59.116 10:26:20 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:30:59.116 10:26:20 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:59.116 "subsystems": [ 00:30:59.116 { 00:30:59.116 "subsystem": "bdev", 00:30:59.116 "config": [ 00:30:59.116 { 00:30:59.116 "method": "bdev_nvme_attach_controller", 00:30:59.116 "params": { 00:30:59.116 "trtype": "tcp", 00:30:59.116 "adrfam": "IPv4", 00:30:59.116 "name": "Nvme0", 00:30:59.116 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:59.116 "traddr": "10.0.0.2", 00:30:59.116 "trsvcid": "4420" 00:30:59.116 } 00:30:59.116 }, 00:30:59.116 { 00:30:59.116 "method": "bdev_set_options", 00:30:59.116 "params": { 00:30:59.116 "bdev_auto_examine": false 00:30:59.116 } 00:30:59.116 } 00:30:59.116 ] 00:30:59.116 } 00:30:59.116 ] 00:30:59.116 }' 
00:30:59.376 [2024-06-10 10:26:21.015378] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:30:59.376 [2024-06-10 10:26:21.015424] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1190406 ] 00:30:59.376 [2024-06-10 10:26:21.100139] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:59.376 [2024-06-10 10:26:21.162374] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:59.636  Copying: 64/64 [kB] (average 15 MBps) 00:30:59.636 00:30:59.636 10:26:21 chaining -- bdev/chaining.sh@106 -- # update_stats 00:30:59.636 10:26:21 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:30:59.636 10:26:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:59.636 10:26:21 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:59.636 10:26:21 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:59.636 10:26:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:59.636 10:26:21 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:59.636 10:26:21 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:59.636 10:26:21 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:59.636 10:26:21 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:59.636 10:26:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:59.636 10:26:21 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:59.897 10:26:21 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:59.897 10:26:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:59.897 10:26:21 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:59.897 10:26:21 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:59.897 10:26:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:59.897 10:26:21 chaining -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:59.897 10:26:21 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:59.897 10:26:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:59.897 10:26:21 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.91TQisj7sI --ob Nvme0n1 --bs 4096 --count 16 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@25 -- # local config 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:59.897 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:59.897 "subsystems": [ 00:30:59.897 { 00:30:59.897 "subsystem": "bdev", 00:30:59.897 "config": [ 00:30:59.897 { 00:30:59.897 "method": "bdev_nvme_attach_controller", 00:30:59.897 "params": { 00:30:59.897 "trtype": "tcp", 00:30:59.897 "adrfam": "IPv4", 00:30:59.897 "name": "Nvme0", 00:30:59.897 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:59.897 "traddr": "10.0.0.2", 00:30:59.897 "trsvcid": "4420" 00:30:59.897 } 00:30:59.897 }, 00:30:59.897 { 00:30:59.897 "method": "bdev_set_options", 00:30:59.897 "params": { 00:30:59.897 "bdev_auto_examine": false 00:30:59.897 } 00:30:59.897 } 00:30:59.897 ] 00:30:59.897 } 00:30:59.897 ] 00:30:59.897 }' 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.91TQisj7sI --ob Nvme0n1 --bs 4096 --count 16 00:30:59.897 10:26:21 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:59.897 "subsystems": [ 00:30:59.897 { 00:30:59.897 "subsystem": "bdev", 00:30:59.897 "config": [ 00:30:59.897 { 00:30:59.897 "method": "bdev_nvme_attach_controller", 00:30:59.897 "params": { 00:30:59.897 "trtype": "tcp", 00:30:59.897 "adrfam": "IPv4", 00:30:59.897 "name": "Nvme0", 00:30:59.897 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:59.897 "traddr": "10.0.0.2", 00:30:59.897 "trsvcid": "4420" 00:30:59.897 } 00:30:59.897 }, 00:30:59.897 { 00:30:59.897 "method": "bdev_set_options", 00:30:59.897 "params": { 00:30:59.897 "bdev_auto_examine": false 00:30:59.897 } 00:30:59.897 } 00:30:59.897 ] 00:30:59.897 } 00:30:59.897 ] 00:30:59.897 }' 00:30:59.897 [2024-06-10 10:26:21.736467] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:30:59.897 [2024-06-10 10:26:21.736515] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1190514 ] 00:31:00.158 [2024-06-10 10:26:21.821796] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:00.158 [2024-06-10 10:26:21.883366] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:00.680  Copying: 64/64 [kB] (average 15 MBps) 00:31:00.680 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:31:00.680 10:26:22 
chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@114 -- # update_stats 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:00.680 10:26:22 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:00.680 10:26:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:00.941 10:26:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:00.941 10:26:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:00.941 10:26:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:00.941 10:26:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:00.941 10:26:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:00.941 
10:26:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:00.941 10:26:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:00.942 10:26:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:00.942 10:26:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:00.942 10:26:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:00.942 10:26:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:00.942 10:26:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@117 -- # : 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.0IFSGyEME5 --ib Nvme0n1 --bs 4096 --count 16 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@25 -- # local config 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:00.942 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:00.942 "subsystems": [ 00:31:00.942 { 00:31:00.942 "subsystem": "bdev", 00:31:00.942 "config": [ 00:31:00.942 { 00:31:00.942 "method": "bdev_nvme_attach_controller", 00:31:00.942 "params": { 00:31:00.942 "trtype": "tcp", 00:31:00.942 "adrfam": "IPv4", 00:31:00.942 "name": "Nvme0", 00:31:00.942 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:00.942 "traddr": "10.0.0.2", 00:31:00.942 "trsvcid": "4420" 00:31:00.942 } 00:31:00.942 }, 00:31:00.942 { 00:31:00.942 "method": "bdev_set_options", 00:31:00.942 "params": { 00:31:00.942 "bdev_auto_examine": false 00:31:00.942 } 00:31:00.942 } 00:31:00.942 ] 00:31:00.942 } 00:31:00.942 ] 00:31:00.942 }' 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.0IFSGyEME5 --ib Nvme0n1 --bs 4096 --count 16 00:31:00.942 10:26:22 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:00.942 "subsystems": [ 00:31:00.942 { 00:31:00.942 "subsystem": "bdev", 00:31:00.942 "config": [ 00:31:00.942 { 00:31:00.942 "method": "bdev_nvme_attach_controller", 00:31:00.942 "params": { 00:31:00.942 "trtype": "tcp", 00:31:00.942 "adrfam": "IPv4", 00:31:00.942 "name": "Nvme0", 00:31:00.942 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:00.942 "traddr": "10.0.0.2", 00:31:00.942 "trsvcid": "4420" 00:31:00.942 } 00:31:00.942 }, 00:31:00.942 { 00:31:00.942 
"method": "bdev_set_options", 00:31:00.942 "params": { 00:31:00.942 "bdev_auto_examine": false 00:31:00.942 } 00:31:00.942 } 00:31:00.942 ] 00:31:00.942 } 00:31:00.942 ] 00:31:00.942 }' 00:31:01.203 [2024-06-10 10:26:22.828898] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:31:01.203 [2024-06-10 10:26:22.828945] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1190738 ] 00:31:01.203 [2024-06-10 10:26:22.914786] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:01.203 [2024-06-10 10:26:22.977307] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:01.724  Copying: 64/64 [kB] (average 673 kBps) 00:31:01.724 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:01.724 10:26:23 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:01.724 10:26:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:01.724 10:26:23 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:01.724 10:26:23 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:01.724 10:26:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:01.724 10:26:23 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:01.724 10:26:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:01.725 10:26:23 chaining -- 
common/autotest_common.sh@560 -- # xtrace_disable 00:31:01.725 10:26:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:01.725 10:26:23 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:01.725 10:26:23 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:01.725 10:26:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:01.725 10:26:23 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.91TQisj7sI /tmp/tmp.0IFSGyEME5 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.91TQisj7sI /tmp/tmp.0IFSGyEME5 00:31:01.725 10:26:23 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:31:01.725 10:26:23 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:31:01.725 10:26:23 chaining -- nvmf/common.sh@117 -- # sync 00:31:01.725 10:26:23 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:31:01.725 10:26:23 chaining -- nvmf/common.sh@120 -- # set +e 00:31:01.725 10:26:23 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:31:01.725 10:26:23 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:31:01.725 rmmod nvme_tcp 00:31:01.725 rmmod nvme_fabrics 00:31:01.725 rmmod nvme_keyring 00:31:01.986 10:26:23 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:31:01.986 10:26:23 chaining -- nvmf/common.sh@124 -- # set -e 00:31:01.986 10:26:23 chaining -- nvmf/common.sh@125 -- # return 0 00:31:01.986 10:26:23 chaining -- nvmf/common.sh@489 -- # '[' -n 1189764 ']' 00:31:01.986 10:26:23 chaining -- nvmf/common.sh@490 -- # killprocess 1189764 00:31:01.986 10:26:23 chaining -- common/autotest_common.sh@949 -- # '[' -z 1189764 ']' 00:31:01.986 10:26:23 chaining -- common/autotest_common.sh@953 -- # kill -0 1189764 00:31:01.986 10:26:23 chaining -- common/autotest_common.sh@954 -- # uname 00:31:01.986 10:26:23 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:01.986 10:26:23 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1189764 00:31:01.986 10:26:23 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:31:01.986 10:26:23 chaining -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:31:01.986 10:26:23 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1189764' 00:31:01.986 killing process with pid 1189764 00:31:01.986 10:26:23 chaining -- common/autotest_common.sh@968 -- # kill 1189764 00:31:01.986 10:26:23 chaining -- 
common/autotest_common.sh@973 -- # wait 1189764 00:31:01.986 10:26:23 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:31:01.986 10:26:23 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:31:01.986 10:26:23 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:31:01.986 10:26:23 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:31:01.986 10:26:23 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:31:01.986 10:26:23 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:01.986 10:26:23 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:01.986 10:26:23 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:04.586 10:26:25 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:31:04.586 10:26:25 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:31:04.586 10:26:25 chaining -- bdev/chaining.sh@132 -- # bperfpid=1191222 00:31:04.586 10:26:25 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1191222 00:31:04.586 10:26:25 chaining -- common/autotest_common.sh@830 -- # '[' -z 1191222 ']' 00:31:04.586 10:26:25 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:31:04.586 10:26:25 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:04.586 10:26:25 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:04.586 10:26:25 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:04.586 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:04.586 10:26:25 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:04.586 10:26:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:04.586 [2024-06-10 10:26:25.926018] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 
00:31:04.586 [2024-06-10 10:26:25.926079] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1191222 ] 00:31:04.586 [2024-06-10 10:26:26.017418] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:04.586 [2024-06-10 10:26:26.110516] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:05.158 10:26:26 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:05.158 10:26:26 chaining -- common/autotest_common.sh@863 -- # return 0 00:31:05.158 10:26:26 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:31:05.158 10:26:26 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:05.158 10:26:26 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:05.158 malloc0 00:31:05.158 true 00:31:05.158 true 00:31:05.158 [2024-06-10 10:26:26.917591] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:05.158 crypto0 00:31:05.158 [2024-06-10 10:26:26.925615] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:31:05.158 crypto1 00:31:05.158 10:26:26 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:05.158 10:26:26 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:05.420 Running I/O for 5 seconds... 00:31:10.763 00:31:10.763 Latency(us) 00:31:10.763 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:10.763 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:31:10.763 Verification LBA range: start 0x0 length 0x2000 00:31:10.763 crypto1 : 5.01 14273.45 55.76 0.00 0.00 17886.96 1083.86 11695.66 00:31:10.763 =================================================================================================================== 00:31:10.763 Total : 14273.45 55.76 0.00 0.00 17886.96 1083.86 11695.66 00:31:10.763 0 00:31:10.763 10:26:32 chaining -- bdev/chaining.sh@146 -- # killprocess 1191222 00:31:10.763 10:26:32 chaining -- common/autotest_common.sh@949 -- # '[' -z 1191222 ']' 00:31:10.763 10:26:32 chaining -- common/autotest_common.sh@953 -- # kill -0 1191222 00:31:10.763 10:26:32 chaining -- common/autotest_common.sh@954 -- # uname 00:31:10.763 10:26:32 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:10.763 10:26:32 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1191222 00:31:10.763 10:26:32 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:10.763 10:26:32 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:10.763 10:26:32 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1191222' 00:31:10.763 killing process with pid 1191222 00:31:10.763 10:26:32 chaining -- common/autotest_common.sh@968 -- # kill 1191222 00:31:10.763 Received shutdown signal, test time was about 5.000000 seconds 00:31:10.763 00:31:10.763 Latency(us) 00:31:10.763 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:10.764 =================================================================================================================== 00:31:10.764 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:10.764 10:26:32 chaining -- common/autotest_common.sh@973 -- # wait 1191222 00:31:10.764 10:26:32 chaining -- bdev/chaining.sh@152 -- # bperfpid=1192345 00:31:10.764 
10:26:32 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1192345 00:31:10.764 10:26:32 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:31:10.764 10:26:32 chaining -- common/autotest_common.sh@830 -- # '[' -z 1192345 ']' 00:31:10.764 10:26:32 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:10.764 10:26:32 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:10.764 10:26:32 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:10.764 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:10.764 10:26:32 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:10.764 10:26:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:10.764 [2024-06-10 10:26:32.304602] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:31:10.764 [2024-06-10 10:26:32.304653] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1192345 ] 00:31:10.764 [2024-06-10 10:26:32.391374] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:10.764 [2024-06-10 10:26:32.454000] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:11.334 10:26:33 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:11.334 10:26:33 chaining -- common/autotest_common.sh@863 -- # return 0 00:31:11.334 10:26:33 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:31:11.334 10:26:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:11.334 10:26:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:11.593 malloc0 00:31:11.593 true 00:31:11.593 true 00:31:11.593 [2024-06-10 10:26:33.233002] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:31:11.593 [2024-06-10 10:26:33.233038] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:11.594 [2024-06-10 10:26:33.233051] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x265dae0 00:31:11.594 [2024-06-10 10:26:33.233057] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:11.594 [2024-06-10 10:26:33.233922] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:11.594 [2024-06-10 10:26:33.233938] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:31:11.594 pt0 00:31:11.594 [2024-06-10 10:26:33.241029] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:11.594 crypto0 00:31:11.594 [2024-06-10 10:26:33.249047] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:31:11.594 crypto1 00:31:11.594 10:26:33 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:11.594 10:26:33 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:11.594 Running I/O for 5 seconds... 
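For orientation on the bdevperf runs in this stretch: the binary is started with -z --wait-for-rpc so it idles on /var/tmp/spdk.sock until configured, the malloc/passthru/crypto bdevs are created over RPC, and bdevperf.py perform_tests then launches the 5-second verify workload whose results follow. A bare-bones sketch of that driver flow is below; the polling loop and the elided setup RPCs are assumptions, the bdevperf flags and the perform_tests call are taken from the trace.

    # Sketch only: drive an idle bdevperf instance over its RPC socket.
    build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
    bperf_pid=$!
    until scripts/rpc.py rpc_get_methods >/dev/null 2>&1; do sleep 0.5; done  # crude waitforlisten
    # ... create the accel keys and malloc/passthru/crypto bdevs here via scripts/rpc.py (elided) ...
    examples/bdev/bdevperf/bdevperf.py perform_tests
    kill "$bperf_pid"; wait "$bperf_pid" || true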
00:31:16.883 00:31:16.883 Latency(us) 00:31:16.883 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:16.883 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:31:16.883 Verification LBA range: start 0x0 length 0x2000 00:31:16.883 crypto1 : 5.01 11294.17 44.12 0.00 0.00 22612.68 5268.09 13510.50 00:31:16.883 =================================================================================================================== 00:31:16.883 Total : 11294.17 44.12 0.00 0.00 22612.68 5268.09 13510.50 00:31:16.883 0 00:31:16.883 10:26:38 chaining -- bdev/chaining.sh@167 -- # killprocess 1192345 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@949 -- # '[' -z 1192345 ']' 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@953 -- # kill -0 1192345 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@954 -- # uname 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1192345 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1192345' 00:31:16.883 killing process with pid 1192345 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@968 -- # kill 1192345 00:31:16.883 Received shutdown signal, test time was about 5.000000 seconds 00:31:16.883 00:31:16.883 Latency(us) 00:31:16.883 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:16.883 =================================================================================================================== 00:31:16.883 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@973 -- # wait 1192345 00:31:16.883 10:26:38 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:31:16.883 10:26:38 chaining -- bdev/chaining.sh@170 -- # killprocess 1192345 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@949 -- # '[' -z 1192345 ']' 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@953 -- # kill -0 1192345 00:31:16.883 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 953: kill: (1192345) - No such process 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@976 -- # echo 'Process with pid 1192345 is not found' 00:31:16.883 Process with pid 1192345 is not found 00:31:16.883 10:26:38 chaining -- bdev/chaining.sh@171 -- # wait 1192345 00:31:16.883 10:26:38 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:31:16.883 10:26:38 
chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:31:16.883 10:26:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@296 -- # e810=() 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@297 -- # x722=() 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@298 -- # mlx=() 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:31:16.883 10:26:38 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.0 (0x8086 - 0x159b)' 00:31:16.884 Found 0000:4b:00.0 (0x8086 - 0x159b) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 
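The nvmf/common.sh trace above is enumerating the detected 0x8086:0x159b ports and resolving each PCI address to a kernel interface through sysfs, before the script later splits the resulting cvl_* devices into target and initiator roles. A reduced sketch of that lookup, with the two PCI addresses from this host hard-coded purely for illustration:

    # Illustrative only: map PCI NIC addresses to net device names via sysfs.
    for pci in 0000:4b:00.0 0000:4b:00.1; do
        pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*)
        [[ -e ${pci_net_devs[0]} ]] || continue           # skip ports with no bound netdev
        pci_net_devs=("${pci_net_devs[@]##*/}")           # strip the sysfs path prefix
        echo "Found net devices under $pci: ${pci_net_devs[*]}"
    done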
00:31:16.884 10:26:38 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.1 (0x8086 - 0x159b)' 00:31:16.884 Found 0000:4b:00.1 (0x8086 - 0x159b) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.0: cvl_0_0' 00:31:16.884 Found net devices under 0000:4b:00.0: cvl_0_0 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.1: cvl_0_1' 00:31:16.884 Found net devices under 0000:4b:00.1: cvl_0_1 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@242 -- # 
NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:31:16.884 10:26:38 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:31:17.146 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:31:17.146 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.582 ms 00:31:17.146 00:31:17.146 --- 10.0.0.2 ping statistics --- 00:31:17.146 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:17.146 rtt min/avg/max/mdev = 0.582/0.582/0.582/0.000 ms 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:31:17.146 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:31:17.146 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.274 ms 00:31:17.146 00:31:17.146 --- 10.0.0.1 ping statistics --- 00:31:17.146 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:17.146 rtt min/avg/max/mdev = 0.274/0.274/0.274/0.000 ms 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@422 -- # return 0 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:31:17.146 10:26:38 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:31:17.146 10:26:38 chaining -- common/autotest_common.sh@723 -- # xtrace_disable 00:31:17.146 10:26:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@481 -- # nvmfpid=1193392 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@482 -- # waitforlisten 1193392 00:31:17.146 10:26:38 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:31:17.146 10:26:38 chaining -- common/autotest_common.sh@830 -- # '[' -z 1193392 ']' 00:31:17.146 10:26:38 chaining -- 
common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:17.146 10:26:38 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:17.146 10:26:38 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:17.146 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:17.146 10:26:38 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:17.146 10:26:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:17.146 [2024-06-10 10:26:38.979399] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:31:17.146 [2024-06-10 10:26:38.979462] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:17.407 [2024-06-10 10:26:39.059962] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:17.407 [2024-06-10 10:26:39.129617] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:31:17.407 [2024-06-10 10:26:39.129654] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:31:17.407 [2024-06-10 10:26:39.129661] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:31:17.407 [2024-06-10 10:26:39.129667] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:31:17.407 [2024-06-10 10:26:39.129672] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:31:17.407 [2024-06-10 10:26:39.129690] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:31:17.978 10:26:39 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:17.978 10:26:39 chaining -- common/autotest_common.sh@863 -- # return 0 00:31:17.978 10:26:39 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:31:17.978 10:26:39 chaining -- common/autotest_common.sh@729 -- # xtrace_disable 00:31:17.978 10:26:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:18.239 10:26:39 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:31:18.239 10:26:39 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:31:18.239 10:26:39 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:18.239 10:26:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:18.239 malloc0 00:31:18.239 [2024-06-10 10:26:39.868167] tcp.c: 716:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:31:18.239 [2024-06-10 10:26:39.884326] tcp.c:1031:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:31:18.239 10:26:39 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:18.239 10:26:39 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:31:18.239 10:26:39 chaining -- bdev/chaining.sh@189 -- # bperfpid=1193704 00:31:18.239 10:26:39 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1193704 /var/tmp/bperf.sock 00:31:18.239 10:26:39 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:31:18.239 10:26:39 chaining -- common/autotest_common.sh@830 -- # '[' -z 1193704 ']' 
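The nvmf_tcp_init block traced above wires the two E810 ports into a loopback topology: the target port is moved into a private network namespace and the initiator talks to it from the root namespace. A condensed sketch, with the namespace, interface and address names taken directly from the log:

    ip -4 addr flush cvl_0_0; ip -4 addr flush cvl_0_1             # drop any stale addresses
    ip netns add cvl_0_0_ns_spdk
    ip link set cvl_0_0 netns cvl_0_0_ns_spdk                      # target side lives in the namespace
    ip addr add 10.0.0.1/24 dev cvl_0_1                            # initiator side stays in the root namespace
    ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0
    ip link set cvl_0_1 up
    ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up
    ip netns exec cvl_0_0_ns_spdk ip link set lo up
    iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT   # let NVMe/TCP traffic back in
    ping -c 1 10.0.0.2 && ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1

Both pings succeed above, after which nvmf_tgt is started inside the namespace (the doubled "ip netns exec cvl_0_0_ns_spdk" prefix in the trace comes from NVMF_TARGET_NS_CMD being prepended to NVMF_APP) and listens on 10.0.0.2 port 4420, so the bdevperf initiator can reach it from the root-namespace side.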
00:31:18.239 10:26:39 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/bperf.sock 00:31:18.239 10:26:39 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:18.239 10:26:39 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:31:18.239 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:31:18.239 10:26:39 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:18.239 10:26:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:18.239 [2024-06-10 10:26:39.948707] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:31:18.239 [2024-06-10 10:26:39.948748] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1193704 ] 00:31:18.239 [2024-06-10 10:26:40.034009] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:18.239 [2024-06-10 10:26:40.097461] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:19.179 10:26:40 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:19.179 10:26:40 chaining -- common/autotest_common.sh@863 -- # return 0 00:31:19.179 10:26:40 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:31:19.179 10:26:40 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:31:19.439 [2024-06-10 10:26:41.103899] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:19.439 nvme0n1 00:31:19.439 true 00:31:19.439 crypto0 00:31:19.439 10:26:41 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:31:19.439 Running I/O for 5 seconds... 
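Both benchmark passes in this test are driven the same way: bdevperf is started paused (-z --wait-for-rpc) on its own RPC socket, the NVMe-oF initiator plus crypto vbdev stack is configured over that socket (the rpc_bperf call at chaining.sh@192, whose individual RPCs are not all echoed here, only the resulting 'Found key "key0"', nvme0n1 and crypto0 notices), and bdevperf.py perform_tests starts the timed run. A condensed sketch using the binaries, socket and options from the trace:

    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
    bperfpid=$!
    waitforlisten "$bperfpid" /var/tmp/bperf.sock       # autotest_common.sh helper, as traced above
    # ... rpc.py -s /var/tmp/bperf.sock <chaining config>: attach the NVMe/TCP controller and create crypto0 ...
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
        -s /var/tmp/bperf.sock perform_tests

The second pass (chaining.sh@217, further down) repeats this with -o 65536 -q 32 to exercise larger chained I/O at a lower queue depth.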
00:31:24.732 00:31:24.732 Latency(us) 00:31:24.732 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:24.732 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:31:24.732 Verification LBA range: start 0x0 length 0x2000 00:31:24.733 crypto0 : 5.02 10752.38 42.00 0.00 0.00 23740.41 4537.11 19963.27 00:31:24.733 =================================================================================================================== 00:31:24.733 Total : 10752.38 42.00 0.00 0.00 23740.41 4537.11 19963.27 00:31:24.733 0 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@205 -- # sequence=107908 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:24.733 10:26:46 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:24.994 10:26:46 chaining -- bdev/chaining.sh@206 -- # encrypt=53954 00:31:24.994 10:26:46 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:31:24.994 10:26:46 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:31:24.994 10:26:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:24.994 10:26:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:24.994 10:26:46 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:24.994 10:26:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:24.994 10:26:46 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:24.994 10:26:46 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:24.994 10:26:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:24.994 10:26:46 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:25.254 10:26:46 chaining -- bdev/chaining.sh@207 -- # decrypt=53954 
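The get_stat_bperf reads above (sequence_executed, encrypt, decrypt) and the crc32c read just below reduce to the following check; rpc_bperf is the chaining.sh@22 wrapper around rpc.py on the bdevperf socket, and the jq filters are the ones shown in the trace:

    rpc_bperf() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock "$@"; }
    sequence=$(rpc_bperf accel_get_stats | jq -r '.sequence_executed')
    encrypt=$(rpc_bperf accel_get_stats | jq -r '.operations[] | select(.opcode == "encrypt").executed')
    decrypt=$(rpc_bperf accel_get_stats | jq -r '.operations[] | select(.opcode == "decrypt").executed')
    crc32c=$(rpc_bperf accel_get_stats | jq -r '.operations[] | select(.opcode == "crc32c").executed')
    (( sequence > 0 ))                      # chained sequences actually executed
    (( encrypt + decrypt == sequence ))     # every sequence carried exactly one crypto operation
    (( encrypt + decrypt == crc32c ))       # and each crypto operation was chained with a crc32c

For this first pass the numbers line up: 53954 encrypts plus 53954 decrypts match the 107908 executed sequences and the 107908 crc32c operations, so the verify workload really did run through the encrypt/decrypt + crc32c chain.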
00:31:25.254 10:26:46 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:31:25.254 10:26:46 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:31:25.254 10:26:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:25.254 10:26:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:25.254 10:26:46 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:31:25.254 10:26:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:25.254 10:26:46 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:31:25.254 10:26:46 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:25.254 10:26:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:31:25.254 10:26:46 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:25.254 10:26:47 chaining -- bdev/chaining.sh@208 -- # crc32c=107908 00:31:25.254 10:26:47 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:31:25.254 10:26:47 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:31:25.254 10:26:47 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:31:25.254 10:26:47 chaining -- bdev/chaining.sh@214 -- # killprocess 1193704 00:31:25.254 10:26:47 chaining -- common/autotest_common.sh@949 -- # '[' -z 1193704 ']' 00:31:25.254 10:26:47 chaining -- common/autotest_common.sh@953 -- # kill -0 1193704 00:31:25.254 10:26:47 chaining -- common/autotest_common.sh@954 -- # uname 00:31:25.254 10:26:47 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:25.254 10:26:47 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1193704 00:31:25.254 10:26:47 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:25.254 10:26:47 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:25.254 10:26:47 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1193704' 00:31:25.254 killing process with pid 1193704 00:31:25.254 10:26:47 chaining -- common/autotest_common.sh@968 -- # kill 1193704 00:31:25.254 Received shutdown signal, test time was about 5.000000 seconds 00:31:25.254 00:31:25.254 Latency(us) 00:31:25.254 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:25.254 =================================================================================================================== 00:31:25.254 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:25.514 10:26:47 chaining -- common/autotest_common.sh@973 -- # wait 1193704 00:31:25.514 10:26:47 chaining -- bdev/chaining.sh@219 -- # bperfpid=1194933 00:31:25.514 10:26:47 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1194933 /var/tmp/bperf.sock 00:31:25.514 10:26:47 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:31:25.514 10:26:47 chaining -- common/autotest_common.sh@830 -- # '[' -z 1194933 ']' 00:31:25.514 10:26:47 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/bperf.sock 00:31:25.514 10:26:47 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:25.514 10:26:47 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 
00:31:25.514 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:31:25.514 10:26:47 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:25.514 10:26:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:25.514 [2024-06-10 10:26:47.298053] Starting SPDK v24.09-pre git sha1 3a44739b7 / DPDK 24.03.0 initialization... 00:31:25.514 [2024-06-10 10:26:47.298100] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1194933 ] 00:31:25.774 [2024-06-10 10:26:47.381842] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:25.774 [2024-06-10 10:26:47.444013] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:26.344 10:26:48 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:26.344 10:26:48 chaining -- common/autotest_common.sh@863 -- # return 0 00:31:26.344 10:26:48 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:31:26.344 10:26:48 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:31:26.604 [2024-06-10 10:26:48.465256] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:26.865 nvme0n1 00:31:26.865 true 00:31:26.865 crypto0 00:31:26.865 10:26:48 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:31:26.865 Running I/O for 5 seconds... 00:31:32.150 00:31:32.151 Latency(us) 00:31:32.151 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:32.151 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:31:32.151 Verification LBA range: start 0x0 length 0x200 00:31:32.151 crypto0 : 5.01 2302.67 143.92 0.00 0.00 13600.76 529.33 17140.18 00:31:32.151 =================================================================================================================== 00:31:32.151 Total : 2302.67 143.92 0.00 0.00 13600.76 529.33 17140.18 00:31:32.151 0 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@233 -- # sequence=23056 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:32.151 10:26:53 chaining -- 
bdev/chaining.sh@39 -- # opcode=encrypt 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:32.151 10:26:53 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:32.410 10:26:54 chaining -- bdev/chaining.sh@234 -- # encrypt=11528 00:31:32.410 10:26:54 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:31:32.410 10:26:54 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:31:32.410 10:26:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:32.410 10:26:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:32.410 10:26:54 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:32.410 10:26:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:32.410 10:26:54 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:32.410 10:26:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:32.410 10:26:54 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@235 -- # decrypt=11528 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:31:32.411 10:26:54 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:32.670 10:26:54 chaining -- bdev/chaining.sh@236 -- # crc32c=23056 00:31:32.670 10:26:54 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:31:32.670 10:26:54 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:31:32.670 10:26:54 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:31:32.670 10:26:54 chaining -- bdev/chaining.sh@242 -- # killprocess 1194933 00:31:32.671 10:26:54 chaining -- common/autotest_common.sh@949 -- # '[' -z 1194933 ']' 00:31:32.671 10:26:54 chaining -- common/autotest_common.sh@953 -- # kill -0 1194933 00:31:32.671 10:26:54 chaining -- common/autotest_common.sh@954 -- # uname 00:31:32.671 10:26:54 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:32.671 10:26:54 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1194933 00:31:32.671 10:26:54 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:32.671 10:26:54 chaining -- 
common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:32.671 10:26:54 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1194933' 00:31:32.671 killing process with pid 1194933 00:31:32.671 10:26:54 chaining -- common/autotest_common.sh@968 -- # kill 1194933 00:31:32.671 Received shutdown signal, test time was about 5.000000 seconds 00:31:32.671 00:31:32.671 Latency(us) 00:31:32.671 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:32.671 =================================================================================================================== 00:31:32.671 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:32.671 10:26:54 chaining -- common/autotest_common.sh@973 -- # wait 1194933 00:31:32.930 10:26:54 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:31:32.930 10:26:54 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:31:32.930 10:26:54 chaining -- nvmf/common.sh@117 -- # sync 00:31:32.930 10:26:54 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:31:32.930 10:26:54 chaining -- nvmf/common.sh@120 -- # set +e 00:31:32.930 10:26:54 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:31:32.930 10:26:54 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:31:32.930 rmmod nvme_tcp 00:31:32.930 rmmod nvme_fabrics 00:31:32.930 rmmod nvme_keyring 00:31:32.930 10:26:54 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:31:32.930 10:26:54 chaining -- nvmf/common.sh@124 -- # set -e 00:31:32.930 10:26:54 chaining -- nvmf/common.sh@125 -- # return 0 00:31:32.930 10:26:54 chaining -- nvmf/common.sh@489 -- # '[' -n 1193392 ']' 00:31:32.930 10:26:54 chaining -- nvmf/common.sh@490 -- # killprocess 1193392 00:31:32.930 10:26:54 chaining -- common/autotest_common.sh@949 -- # '[' -z 1193392 ']' 00:31:32.930 10:26:54 chaining -- common/autotest_common.sh@953 -- # kill -0 1193392 00:31:32.930 10:26:54 chaining -- common/autotest_common.sh@954 -- # uname 00:31:32.930 10:26:54 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:32.930 10:26:54 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1193392 00:31:32.930 10:26:54 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:31:32.930 10:26:54 chaining -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:31:32.930 10:26:54 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1193392' 00:31:32.930 killing process with pid 1193392 00:31:32.930 10:26:54 chaining -- common/autotest_common.sh@968 -- # kill 1193392 00:31:32.930 10:26:54 chaining -- common/autotest_common.sh@973 -- # wait 1193392 00:31:33.191 10:26:54 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:31:33.191 10:26:54 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:31:33.191 10:26:54 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:31:33.191 10:26:54 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:31:33.191 10:26:54 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:31:33.191 10:26:54 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:33.191 10:26:54 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:33.191 10:26:54 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:35.103 10:26:56 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:31:35.103 10:26:56 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:31:35.103 
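The nvmftestfini teardown traced above unwinds the earlier setup. A condensed sketch; the body of _remove_spdk_ns is not echoed in this log, so the netns delete line is an assumption based on the namespace created earlier:

    sync
    modprobe -v -r nvme-tcp            # the trace shows nvme_tcp, nvme_fabrics and nvme_keyring being removed
    modprobe -v -r nvme-fabrics
    killprocess "$nvmfpid"             # pid 1193392, the nvmf_tgt launched by nvmfappstart -m 0x2
    _remove_spdk_ns                    # assumed to boil down to: ip netns delete cvl_0_0_ns_spdk (not visible here)
    ip -4 addr flush cvl_0_1
    trap - SIGINT SIGTERM EXIT         # chaining.sh@245: drop the bperfcleanup/nvmftestfini error trap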
00:31:35.103 real 0m47.752s 00:31:35.103 user 0m57.455s 00:31:35.103 sys 0m10.965s 00:31:35.103 10:26:56 chaining -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:35.103 10:26:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:35.103 ************************************ 00:31:35.103 END TEST chaining 00:31:35.103 ************************************ 00:31:35.363 10:26:56 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:31:35.363 10:26:56 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:31:35.363 10:26:56 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:31:35.363 10:26:56 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:31:35.363 10:26:56 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:31:35.363 10:26:56 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:31:35.363 10:26:56 -- common/autotest_common.sh@723 -- # xtrace_disable 00:31:35.363 10:26:56 -- common/autotest_common.sh@10 -- # set +x 00:31:35.363 10:26:57 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:31:35.363 10:26:57 -- common/autotest_common.sh@1391 -- # local autotest_es=0 00:31:35.363 10:26:57 -- common/autotest_common.sh@1392 -- # xtrace_disable 00:31:35.363 10:26:57 -- common/autotest_common.sh@10 -- # set +x 00:31:41.952 INFO: APP EXITING 00:31:41.952 INFO: killing all VMs 00:31:41.952 INFO: killing vhost app 00:31:41.952 WARN: no vhost pid file found 00:31:41.952 INFO: EXIT DONE 00:31:46.162 Waiting for block devices as requested 00:31:46.162 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma 00:31:46.162 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma 00:31:46.162 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma 00:31:46.162 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma 00:31:46.162 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma 00:31:46.162 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma 00:31:46.424 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma 00:31:46.424 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma 00:31:46.424 0000:65:00.0 (8086 0a54): vfio-pci -> nvme 00:31:46.685 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma 00:31:46.685 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma 00:31:46.685 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma 00:31:46.946 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma 00:31:46.946 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma 00:31:46.946 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma 00:31:47.207 0000:00:01.0 (8086 0b00): vfio-pci -> ioatdma 00:31:47.207 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma 00:31:51.411 Cleaning 00:31:51.411 Removing: /var/run/dpdk/spdk0/config 00:31:51.411 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:31:51.411 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:31:51.411 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:31:51.411 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:31:51.411 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:31:51.411 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:31:51.411 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:31:51.412 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:31:51.412 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:31:51.412 Removing: /var/run/dpdk/spdk0/hugepage_info 00:31:51.412 Removing: /dev/shm/nvmf_trace.0 00:31:51.412 Removing: /dev/shm/spdk_tgt_trace.pid910269 00:31:51.672 Removing: /var/run/dpdk/spdk0 00:31:51.672 Removing: /var/run/dpdk/spdk_pid1001976 00:31:51.672 Removing: /var/run/dpdk/spdk_pid1004387 00:31:51.672 Removing: /var/run/dpdk/spdk_pid1005343 00:31:51.672 Removing: /var/run/dpdk/spdk_pid1015609 00:31:51.672 
Removing: /var/run/dpdk/spdk_pid1019133 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1020170 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1032112 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1034734 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1035933 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1047437 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1050081 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1051227 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1062651 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1067342 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1068435 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1069495 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1072768 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1078405 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1081303 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1086432 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1090356 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1096621 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1100189 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1106966 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1109452 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1115973 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1118436 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1124963 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1127429 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1132276 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1132740 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1133421 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1133980 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1134646 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1135339 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1136240 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1136791 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1138855 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1140851 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1142974 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1144739 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1146751 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1148896 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1151055 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1152674 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1153311 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1153909 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1156119 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1158586 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1160875 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1162119 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1163758 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1164389 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1164538 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1164725 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1165249 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1165360 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1166687 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1168670 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1170654 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1171584 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1172517 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1172835 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1172862 00:31:51.673 Removing: /var/run/dpdk/spdk_pid1172897 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1174159 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1174786 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1175166 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1177494 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1179800 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1182117 00:31:51.933 
Removing: /var/run/dpdk/spdk_pid1183522 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1184861 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1185485 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1185529 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1190042 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1190165 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1190406 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1190514 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1190738 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1191222 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1192345 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1193704 00:31:51.933 Removing: /var/run/dpdk/spdk_pid1194933 00:31:51.933 Removing: /var/run/dpdk/spdk_pid905698 00:31:51.933 Removing: /var/run/dpdk/spdk_pid908053 00:31:51.933 Removing: /var/run/dpdk/spdk_pid910269 00:31:51.934 Removing: /var/run/dpdk/spdk_pid910762 00:31:51.934 Removing: /var/run/dpdk/spdk_pid911705 00:31:51.934 Removing: /var/run/dpdk/spdk_pid912011 00:31:51.934 Removing: /var/run/dpdk/spdk_pid912980 00:31:51.934 Removing: /var/run/dpdk/spdk_pid913096 00:31:51.934 Removing: /var/run/dpdk/spdk_pid913405 00:31:51.934 Removing: /var/run/dpdk/spdk_pid916625 00:31:51.934 Removing: /var/run/dpdk/spdk_pid918578 00:31:51.934 Removing: /var/run/dpdk/spdk_pid918927 00:31:51.934 Removing: /var/run/dpdk/spdk_pid919290 00:31:51.934 Removing: /var/run/dpdk/spdk_pid919602 00:31:51.934 Removing: /var/run/dpdk/spdk_pid919828 00:31:51.934 Removing: /var/run/dpdk/spdk_pid920061 00:31:51.934 Removing: /var/run/dpdk/spdk_pid920380 00:31:51.934 Removing: /var/run/dpdk/spdk_pid920733 00:31:51.934 Removing: /var/run/dpdk/spdk_pid921699 00:31:51.934 Removing: /var/run/dpdk/spdk_pid924768 00:31:51.934 Removing: /var/run/dpdk/spdk_pid925154 00:31:51.934 Removing: /var/run/dpdk/spdk_pid925515 00:31:51.934 Removing: /var/run/dpdk/spdk_pid925688 00:31:51.934 Removing: /var/run/dpdk/spdk_pid925878 00:31:51.934 Removing: /var/run/dpdk/spdk_pid925949 00:31:51.934 Removing: /var/run/dpdk/spdk_pid926267 00:31:51.934 Removing: /var/run/dpdk/spdk_pid926519 00:31:51.934 Removing: /var/run/dpdk/spdk_pid926941 00:31:51.934 Removing: /var/run/dpdk/spdk_pid927394 00:31:51.934 Removing: /var/run/dpdk/spdk_pid927716 00:31:51.934 Removing: /var/run/dpdk/spdk_pid928031 00:31:51.934 Removing: /var/run/dpdk/spdk_pid928164 00:31:51.934 Removing: /var/run/dpdk/spdk_pid928394 00:31:51.934 Removing: /var/run/dpdk/spdk_pid928706 00:31:51.934 Removing: /var/run/dpdk/spdk_pid929029 00:31:51.934 Removing: /var/run/dpdk/spdk_pid929242 00:31:51.934 Removing: /var/run/dpdk/spdk_pid929399 00:31:51.934 Removing: /var/run/dpdk/spdk_pid929705 00:31:51.934 Removing: /var/run/dpdk/spdk_pid930024 00:31:51.934 Removing: /var/run/dpdk/spdk_pid930348 00:31:51.934 Removing: /var/run/dpdk/spdk_pid930459 00:31:51.934 Removing: /var/run/dpdk/spdk_pid930703 00:31:51.934 Removing: /var/run/dpdk/spdk_pid931035 00:31:51.934 Removing: /var/run/dpdk/spdk_pid931350 00:31:51.934 Removing: /var/run/dpdk/spdk_pid931595 00:31:52.195 Removing: /var/run/dpdk/spdk_pid931720 00:31:52.195 Removing: /var/run/dpdk/spdk_pid932046 00:31:52.195 Removing: /var/run/dpdk/spdk_pid932365 00:31:52.195 Removing: /var/run/dpdk/spdk_pid932692 00:31:52.195 Removing: /var/run/dpdk/spdk_pid933015 00:31:52.195 Removing: /var/run/dpdk/spdk_pid933339 00:31:52.195 Removing: /var/run/dpdk/spdk_pid933659 00:31:52.195 Removing: /var/run/dpdk/spdk_pid933984 00:31:52.195 Removing: /var/run/dpdk/spdk_pid934226 00:31:52.195 Removing: /var/run/dpdk/spdk_pid934501 00:31:52.195 
Removing: /var/run/dpdk/spdk_pid934925 00:31:52.195 Removing: /var/run/dpdk/spdk_pid935277 00:31:52.195 Removing: /var/run/dpdk/spdk_pid935502 00:31:52.195 Removing: /var/run/dpdk/spdk_pid939504 00:31:52.195 Removing: /var/run/dpdk/spdk_pid941602 00:31:52.195 Removing: /var/run/dpdk/spdk_pid943830 00:31:52.195 Removing: /var/run/dpdk/spdk_pid944775 00:31:52.195 Removing: /var/run/dpdk/spdk_pid946040 00:31:52.195 Removing: /var/run/dpdk/spdk_pid946429 00:31:52.195 Removing: /var/run/dpdk/spdk_pid946623 00:31:52.195 Removing: /var/run/dpdk/spdk_pid946678 00:31:52.195 Removing: /var/run/dpdk/spdk_pid951413 00:31:52.195 Removing: /var/run/dpdk/spdk_pid952014 00:31:52.195 Removing: /var/run/dpdk/spdk_pid953251 00:31:52.195 Removing: /var/run/dpdk/spdk_pid953558 00:31:52.195 Removing: /var/run/dpdk/spdk_pid959459 00:31:52.195 Removing: /var/run/dpdk/spdk_pid961555 00:31:52.195 Removing: /var/run/dpdk/spdk_pid962570 00:31:52.195 Removing: /var/run/dpdk/spdk_pid967086 00:31:52.195 Removing: /var/run/dpdk/spdk_pid968836 00:31:52.195 Removing: /var/run/dpdk/spdk_pid969850 00:31:52.195 Removing: /var/run/dpdk/spdk_pid974347 00:31:52.195 Removing: /var/run/dpdk/spdk_pid976856 00:31:52.195 Removing: /var/run/dpdk/spdk_pid977868 00:31:52.195 Removing: /var/run/dpdk/spdk_pid988065 00:31:52.195 Removing: /var/run/dpdk/spdk_pid990201 00:31:52.195 Removing: /var/run/dpdk/spdk_pid991281 00:31:52.195 Clean 00:31:52.195 10:27:14 -- common/autotest_common.sh@1450 -- # return 0 00:31:52.455 10:27:14 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:31:52.455 10:27:14 -- common/autotest_common.sh@729 -- # xtrace_disable 00:31:52.455 10:27:14 -- common/autotest_common.sh@10 -- # set +x 00:31:52.455 10:27:14 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:31:52.455 10:27:14 -- common/autotest_common.sh@729 -- # xtrace_disable 00:31:52.455 10:27:14 -- common/autotest_common.sh@10 -- # set +x 00:31:52.455 10:27:14 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:31:52.455 10:27:14 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:31:52.455 10:27:14 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:31:52.455 10:27:14 -- spdk/autotest.sh@391 -- # hash lcov 00:31:52.455 10:27:14 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:31:52.455 10:27:14 -- spdk/autotest.sh@393 -- # hostname 00:31:52.455 10:27:14 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-CYP-06 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:31:52.716 geninfo: WARNING: invalid characters removed from testname! 
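The coverage post-processing around this point condenses to the steps below. LCOV_OPTS stands in for the long option string repeated on every lcov call in the trace (the genhtml_*/geninfo_all_blocks rc flags are left out here for brevity), and $rootdir stands in for the /var/jenkins/workspace/crypto-phy-autotest/spdk checkout:

    LCOV_OPTS="--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --no-external -q"
    lcov $LCOV_OPTS -c -d "$rootdir" -t "$(hostname)" -o cov_test.info       # capture from the instrumented run
    lcov $LCOV_OPTS -a cov_base.info -a cov_test.info -o cov_total.info      # merge with the pre-test baseline
    for excl in '*/dpdk/*' '/usr/*' '*/examples/vmd/*' '*/app/spdk_lspci/*' '*/app/spdk_top/*'; do
        lcov $LCOV_OPTS -r cov_total.info "$excl" -o cov_total.info          # strip third-party and helper-app code
    done
    rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR

The "invalid characters removed from testname" warning above appears to be geninfo reacting to the hyphens in the hostname-derived test name (spdk-CYP-06); the coverage data itself is still captured into cov_test.info.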
00:32:14.674 10:27:35 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:17.212 10:27:38 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:19.147 10:27:40 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:21.154 10:27:42 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:23.067 10:27:44 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:25.615 10:27:46 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:27.528 10:27:48 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:32:27.528 10:27:48 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:32:27.528 10:27:48 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:32:27.528 10:27:48 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:32:27.528 10:27:49 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:32:27.528 10:27:49 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:27.529 10:27:49 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:27.529 10:27:49 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:27.529 10:27:49 -- paths/export.sh@5 -- $ export PATH 00:32:27.529 10:27:49 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:27.529 10:27:49 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:27.529 10:27:49 -- common/autobuild_common.sh@437 -- $ date +%s 00:32:27.529 10:27:49 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1718008069.XXXXXX 00:32:27.529 10:27:49 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1718008069.78VJoB 00:32:27.529 10:27:49 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]] 00:32:27.529 10:27:49 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']' 00:32:27.529 10:27:49 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:32:27.529 10:27:49 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:32:27.529 10:27:49 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:32:27.529 10:27:49 -- common/autobuild_common.sh@453 -- $ get_config_params 00:32:27.529 10:27:49 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:32:27.529 10:27:49 -- common/autotest_common.sh@10 -- $ set +x 00:32:27.529 10:27:49 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:32:27.529 10:27:49 -- common/autobuild_common.sh@455 -- $ start_monitor_resources 00:32:27.529 10:27:49 -- pm/common@17 -- $ local monitor 00:32:27.529 10:27:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:27.529 10:27:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:27.529 10:27:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:27.529 10:27:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:27.529 10:27:49 -- pm/common@25 -- $ sleep 1 00:32:27.529 10:27:49 -- 
pm/common@21 -- $ date +%s 00:32:27.529 10:27:49 -- pm/common@21 -- $ date +%s 00:32:27.529 10:27:49 -- pm/common@21 -- $ date +%s 00:32:27.529 10:27:49 -- pm/common@21 -- $ date +%s 00:32:27.529 10:27:49 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718008069 00:32:27.529 10:27:49 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718008069 00:32:27.529 10:27:49 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718008069 00:32:27.529 10:27:49 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718008069 00:32:27.529 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718008069_collect-cpu-temp.pm.log 00:32:27.529 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718008069_collect-vmstat.pm.log 00:32:27.529 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718008069_collect-cpu-load.pm.log 00:32:27.529 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718008069_collect-bmc-pm.bmc.pm.log 00:32:28.472 10:27:50 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT 00:32:28.472 10:27:50 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j128 00:32:28.472 10:27:50 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:28.472 10:27:50 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:32:28.472 10:27:50 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:32:28.472 10:27:50 -- spdk/autopackage.sh@19 -- $ timing_finish 00:32:28.472 10:27:50 -- common/autotest_common.sh@735 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:32:28.473 10:27:50 -- common/autotest_common.sh@736 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:32:28.473 10:27:50 -- common/autotest_common.sh@738 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:32:28.473 10:27:50 -- spdk/autopackage.sh@20 -- $ exit 0 00:32:28.473 10:27:50 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:32:28.473 10:27:50 -- pm/common@29 -- $ signal_monitor_resources TERM 00:32:28.473 10:27:50 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:32:28.473 10:27:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:28.473 10:27:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:32:28.473 10:27:50 -- pm/common@44 -- $ pid=1208325 00:32:28.473 10:27:50 -- pm/common@50 -- $ kill -TERM 1208325 00:32:28.473 10:27:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:28.473 10:27:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:32:28.473 10:27:50 -- pm/common@44 -- $ pid=1208326 00:32:28.473 10:27:50 -- pm/common@50 
-- $ kill -TERM 1208326 00:32:28.473 10:27:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:28.473 10:27:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:32:28.473 10:27:50 -- pm/common@44 -- $ pid=1208327 00:32:28.473 10:27:50 -- pm/common@50 -- $ kill -TERM 1208327 00:32:28.473 10:27:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:28.473 10:27:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:32:28.473 10:27:50 -- pm/common@44 -- $ pid=1208351 00:32:28.473 10:27:50 -- pm/common@50 -- $ sudo -E kill -TERM 1208351 00:32:28.473 + [[ -n 780154 ]] 00:32:28.473 + sudo kill 780154 00:32:28.484 [Pipeline] } 00:32:28.503 [Pipeline] // stage 00:32:28.509 [Pipeline] } 00:32:28.527 [Pipeline] // timeout 00:32:28.533 [Pipeline] } 00:32:28.552 [Pipeline] // catchError 00:32:28.557 [Pipeline] } 00:32:28.574 [Pipeline] // wrap 00:32:28.580 [Pipeline] } 00:32:28.594 [Pipeline] // catchError 00:32:28.601 [Pipeline] stage 00:32:28.603 [Pipeline] { (Epilogue) 00:32:28.615 [Pipeline] catchError 00:32:28.617 [Pipeline] { 00:32:28.630 [Pipeline] echo 00:32:28.632 Cleanup processes 00:32:28.638 [Pipeline] sh 00:32:28.931 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:28.931 1208443 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:32:28.931 1208825 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:28.947 [Pipeline] sh 00:32:29.237 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:29.237 ++ grep -v 'sudo pgrep' 00:32:29.237 ++ awk '{print $1}' 00:32:29.237 + sudo kill -9 1208443 00:32:29.251 [Pipeline] sh 00:32:29.539 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:32:41.788 [Pipeline] sh 00:32:42.076 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:32:42.076 Artifacts sizes are good 00:32:42.090 [Pipeline] archiveArtifacts 00:32:42.100 Archiving artifacts 00:32:42.284 [Pipeline] sh 00:32:42.574 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:32:42.589 [Pipeline] cleanWs 00:32:42.600 [WS-CLEANUP] Deleting project workspace... 00:32:42.600 [WS-CLEANUP] Deferred wipeout is used... 00:32:42.608 [WS-CLEANUP] done 00:32:42.610 [Pipeline] } 00:32:42.630 [Pipeline] // catchError 00:32:42.643 [Pipeline] sh 00:32:42.930 + logger -p user.info -t JENKINS-CI 00:32:42.939 [Pipeline] } 00:32:42.952 [Pipeline] // stage 00:32:42.956 [Pipeline] } 00:32:42.971 [Pipeline] // node 00:32:42.976 [Pipeline] End of Pipeline 00:32:43.069 Finished: SUCCESS